Theorist Tord Riemann, who made key contributions to e+e– collider phenomenology, left us on 2 April.
Tord was born in 1951 in East Berlin, educated at the Heinrich-Hertz-Gymnasium, a specialist mathematics school in Berlin, and studied physics at Humboldt University from 1970. He graduated in 1977 with a doctorate devoted to studies of the lattice approach to quantum field theory. He obtained a research position in the theory group of the Institute of High Energy Physics of the Academy of Sciences of the GDR in Zeuthen (later DESY Zeuthen), and from 1983 to 1987 worked at JINR, then in the Soviet Union, in the group of Dmitry Bardin.
In 1989/1990 Tord visited the L3 experiment at CERN, starting a fruitful collaboration on the application of the ZFITTER project at the Large Electron–Positron (LEP) collider. In 1991–1992 he was a research associate in the CERN theory division, working out the so-called S-matrix approach to the Z resonance. This was a profound contribution to the field, and a breakthrough for the interpretation of LEP data. Tord was one of the first to realise the great potential of a new e+e– “Tera-Z” factory at the proposed Future Circular Collider, FCC-ee, and led the charge reviving precision calculations for it.
Tord’s scientific fields of interest were broad, and aimed at predicting observables measured at accelerators. His research topics included linear-collider physics; Higgs, WW, ZZ, 2f and 4f production in e+e– scattering; physics at LEP and FCC-ee; methods in the calculation of multi-loop massive Feynman integrals; NNLO Bhabha scattering in QED; higher-order corrections in the electroweak Standard Model and some extensions; and electroweak corrections for deep inelastic scattering at HERA. Apart from ZFITTER, he co-authored several programmes, including topfit, GENTLE/4fan, HECTOR, SMATASY, TERAD91, DISEPNC, DISEPCC, DIZET, polHeCTOR and AMBRE.
While remaining an active research scientist throughout his career, Tord will also be warmly remembered as a great mentor to many of us. He was a thesis advisor for two diploma and seven PhD students, and was actively engaged in supporting many postdoctoral researchers. He was co-founder and organiser of the biennial workshop series Loops and Legs in Quantum Field Theory and of the biennial DESY school Computer Algebra and Particle Physics.
In 2000, Tord and the ZFITTER collaboration were awarded the First Prize of JINR, and in 2014 the article “The ZFITTER Project” was awarded the JINR prize for the best publication of the year in Physics of Elementary Particles and Nuclei. In 2015 Tord was awarded an Alexander von Humboldt Polish Honorary Research Fellowship.
Tord Riemann cared about high standards in scientific research, including ethical issues. He was a true professional of the field. Despite illness, he continued working until his last day.
Tord was an outstanding scientist, a just person of great honesty, a reliable friend, colleague and family man. We feel a great loss, personally and as a scientific community, and remain thankful for his insights, dedication and all the precious moments we have shared.
CERN’s international relationships are central to its work, and a perfect example of nations coming together for the purpose of peaceful research, regardless of external politics. Through working in China during the 1980s and in the Soviet Union/Russia in the early 1990s, physicist Paul Lecoq’s long career is testament to CERN’s influence and standing.
Originally interested in astrophysics, Lecoq completed a PhD in nuclear physics in Montreal in 1972. After finishing his military service, during which he taught nuclear physics at the French Navy School, he came across an advertisement for a fellowship position at CERN. It was the start of a 47-year-long journey with the organisation. “I thought, why not?” Lecoq recalls. “CERN was not my initial target, but I thought it would be a very good place to go. Also, I liked skiing and mountains.”
Royal treatment
During his third year as a fellow, a staff position opened for the upcoming European Hybrid Spectrometer (EHS), which would test CERN’s potential for collaboration beyond its core member states. “The idea was to make a complex multi-detector system, which would be a multi-institute collaboration, with each institute having the responsibility to build one detector,” says Lecoq. One of these institutes was based in Japan, allowing the exchange of personnel. Lecoq was one of the first to benefit from this agreement and, thanks to CERN’s already substantial reputation, he was very well received. “At the time, people were travelling much less than now, and Japan was more isolated. I was welcomed by the president of the university and had a very nice reception almost every day.” It was an early sign of things to come for Lecoq.
During the lifetime of the EHS, a “supergroup” of CERN staff was formed whose main role was to support partners across the world while also building part of the experiment. By the time the Large Electron–Positron Collider (LEP) came to fruition it was clear that it would also benefit from this successful approach. At that time, Sam Ting had been asked to propose an experiment for LEP by then Director-General Herwig Schopper, which would become the L3 experiment, and with the EHS coming to an end, says Lecoq, it was natural that the EHS supergroup was transferred to Ting. Through friends working in material science, Lecoq caught wind of a new scintillator crystal, bismuth germanate (BGO), that was being proposed for L3 – an idea that would see him link up with Ting and spend much of the next few years in China.
BGO crystals had not yet been used in particle physics, and had only existed in a few small samples, but L3 needed more than 1 m³ of coverage. After obtaining and testing the first crystal samples, Lecoq presented his findings at an L3 collaboration meeting. “At the end of the meeting, Ting pointed his finger in my direction and asked if I was free on Saturday. I responded, ‘yes sir’. Then he turned to his secretary and said, ‘book a flight ticket to Shanghai – this guy is coming with me!’”
This is something unique about CERN, where you can meet fantastic people that can completely change your life
Unknown to Lecoq upon his arrival in China, Ting had already laid the groundwork for developing the technology for the mass production of BGO crystals there, and wanted Lecoq to oversee this production. BGO was soon recognised as a crystal that could be produced in large quantities in a reliable and cost-effective way, and it has since been used in a generation of PET scanners. Lecoq was impressed by the authority Ting held in China. “The second day we were in China, we, well Ting, had been invited by the mayor of Shanghai for a dinner to discuss the opportunity for the experiment.” The mayor was Jiang Zemin, who only a few years later became China’s president. “I have been very lucky to have several opportunities like this in my career. This is something unique about CERN, where you can meet fantastic people that can completely change your life. It was also an interesting period when China was slowly opening up to the world – on my first trip everyone was in Mao suits, and in the next three to five years I could see a tremendous change that was so impressive.”
Lecoq’s travels did not stop there. With LEP finishing towards the turn of the millennium and LHC preparations in full swing, his expertise was needed for the production of lead tungstate (PWO) crystals for CMS’s electromagnetic calorimeter. This time, however, Russia was the base of operations, and the 1.2 m³ of BGO crystal for L3 became more than 10 m³ of PWO for CMS. As with his spell in China, Lecoq was in Russia during a politically uncertain time, with his arrival shortly following the fall of the Berlin Wall. “There was no system anymore. But there was still very strong intellectual activity, with scientists at an incredible level, and there was still a lot of production infrastructure for military interest.”
It was interesting not only at the scientific level, but on a human level too
At the time, lithium niobate, a crystal very similar to PWO, was being exploited for radar communication and missile guidance, says Lecoq, and the country had a valuable (but unknown to the public) production infrastructure in place. With the disarray at the end of the Cold War, the European Commission set up a system, along with Canada, Japan and the US, called the International Science and Technology Center (ISTC), whose role was to transfer the Soviet Union’s military industry into civil application. Lecoq was able to meet with ISTC and gain €7 million in funding to support PWO crystal production for CMS. Again, he stresses, this only happened due to the stature of CERN. “I could not have done that if I had been working only as a French scientist. CERN has the diplomatic contact with the European Commission and different governments, and that made it a lot easier.” Lecoq was responsible for choosing where the crystal production would take place. “These top-level scientists working in the military areas felt isolated, especially in a country that was in a period of collapse, so they were more than happy not only to have an opportunity to do their job under better conditions, but also to have the contacts. It was interesting not only at the scientific level, but on a human level too.”
Crystal clear
Back at CERN, Lecoq realised that introducing a new scintillating crystal, optimising its performance for the harsh operating conditions of the LHC, and developing mass-production technologies to produce large amounts of crystal in a reliable and cost-effective way, was a formidable challenge that could not be dealt with by particle physicists alone. Therefore, in 1991, he decided to establish the Crystal Clear multidisciplinary collaboration, gathering experts in material science, crystal growth, luminescence, solid-state physics and beyond. Here again, he says, the attractiveness of CERN as an internationally recognised research centre was a great help in convincing institutes all over the world, some not connected to particle physics at all, to join the collaboration. Crystal Clear is still running today and is celebrating its 30th anniversary.
Through developing international connections in unexpected places, Lecoq’s career has helped build sustained connections for CERN in some of the world’s largest and most scientifically productive regions. Now retired, he is a distinguished professor at the Polytechnic University of Valencia, where he has set up a public–private partnership laboratory for metamaterial-based scintillators and photodetectors, to aid a new generation of ionising-radiation detectors for medical imaging and other applications. Even now, he is able to flex the muscles of the CERN model by keeping in close contact with the organisation.
“My career at CERN has been extremely rich. I have changed so much in the countries I’ve worked with and the scientific aspect, too. It could only have been possible at CERN.”
Somewhat accidentally, because I didn’t even know that being a researcher in physics was a thing you could be until my second year of university. It was around then that I realised that someone like me could ask questions that didn’t have answers. That hooked my interest. My first project was in nuclear physics, and it involved using a particle accelerator for an experiment. I then attended the CERN summer student programme, working on ATLAS, which was my first proper exposure to the technology of particle physics. When it came to the time to do my PhD in around 2006, I had the choice to either stay in Melbourne to do particle physics, or go to Oxford, which had a strong accelerator programme. When I learned they were designing accelerators for cancer treatment, it blew my mind! I took the leap and decided to move to the other side of the world.
What did you do as a postdoc?
I was lucky enough to get an 1851 Royal Commission Fellowship, which allowed me to start an independent research programme. It was a bit of a baptism of fire, as I had been working on medical machines but then moved to high-intensity proton accelerators. I was looking at fixed-field alternating-gradient accelerators and their application to things like accelerator-driven reactors. After a while I found myself spending a lot of time building sophisticated simulations, and was getting a bit bored of computing. So I started a couple of collaborations with some teams in Japan – one of which was using ion traps to mimic the dynamics of particle beams at very high intensity. What I found really interesting is how beams behave at a fundamental level, and I am currently working on upgrading a small experiment called IBEX to test a new type of optics called non-linear integrable optics, which is a focus of Fermilab at the moment.
And now you’re back in the medical arena?
Yes – a few years ago I started working with people from CERN and the UK on compact medical accelerators for low- and middle-income countries. Then in 2019 I felt the pull to return to Australia to grow accelerator physics there. They have accelerators and facilities but didn’t have a strong academic accelerator community, so I am building up a group at Melbourne University that has a medical applications focus, but also looks at other areas. After 20 years of pushing for a proton therapy centre here, the first one is now being built.
How and when did your career in science communication take off?
I was doing things like stage shows for primary-school children when I was a first-year undergraduate. I have always seen it as part of the process of being a scientist. Before my PhD I worked in a science museum and, while at Oxford, I started an outreach programme called Accelerate! that took live science shows to some 30,000 students in its first two years and is still running. From there, it sort of branched out. I did more public lectures, but also a little bit of TV, radio and some writing.
Any advice for physicists who want to get into communication?
You need to build a portfolio, and demonstrate that you have a range of different styles and delivery modes, and that you use language that people understand. The other thing that really helped me was working with professional organisations such as the Royal Institution in London. It does take a lot of time to do both your research and academic job well, and also do the communication well. A lot of my communication is about my research field – so luckily they enrich each other. I think my communication has the potential to have a much bigger societal impact than my research, so I am very serious about it. The first time someone pointed a video camera at me I was terrified. Now I can say what I want to say. We shouldn’t underestimate how much the public wants to hear from real working scientists, so keeping a very strong research base keeps my authenticity and credibility.
What is your work/life balance like?
I am not a fan of the term “work/life balance” as it tends to imply that one is necessarily in conflict with the other. I think it’s important to set up a kind of work/life integration that supports well-being while allowing you to do the work you want to do. When I was invited back to Melbourne to build an accelerator group, I’d just started a new research group in Oxford. I stepped down my teaching and we agreed that I would take periods of sabbatical to spend time in Melbourne until I finished my experiment. I have been so incredibly grateful to everyone on both sides for their understanding. Key to that has been learning how other people’s expectations affect you and finding a way to filter them out and drive your own goals. Working in two completely different time zones, it would be easy to work ridiculously long days, so I have had to learn to protect my health. The hardest thing, and I think a lot of early/mid-career researchers will relate to this, is that academia is an infinite job: you will never do enough for someone to tell you that you have done enough. The pressure always feels like it’s increasing, especially when you are a post-doc or on tenure track, or in the process of establishing a new group or lab. You have to learn how to take care of your mental health and well-being so that you don’t burn out. With everything else that’s going on in the world right now, this is even more important.
You are active in trying to raise the profile of women in physics. What does this involve on a practical level?
There has been a lot of focus for many years on getting more women into subjects like physics. My view is that whenever I meet young people they’re interested already. In many countries the gender balance at undergraduate level is similar. So what’s happening instead is that we are pushing women and minorities out. My focus, within my sphere of influence, is to make sure that the culture that I am perpetuating and the values that I hold within my research groups are well defined and communicated.
I kind of pulled back from active engagement in panel sessions and things like that a number of years ago, because I realised that the most important way I can contribute is by being the best scientist that I can be. The fact that I happen to have a public profile is great in that it makes people aware that people like me exist. One of the things that has helped me the most is to build a really great community of peers of other women in physics. I think for the first seven or eight years of my career, when imposter syndrome was strong and I questioned if I fitted in, I realised that I didn’t have a single direct female colleague. With most people in my field being men, it’s likely that when choosing a speaker, for example, the first person we think of is male. Taking time to be well-networked with women in the field is incredibly important in that regard. Today, I find that creating the right environment means that people will seek out my research group because they hear it’s a nice place to be. Students today are much savvier with this stuff – they can tell toxic professors a mile away. I am trying to show them that there is a way of doing research that doesn’t involve the horrible sides to it. Research is hard enough already, so why make it harder?
Tell us about your debut book The Matter of Everything?
It’s published by Bloomsbury (UK/Commonwealth) and Knopf (US) and is due out in early 2022. Its subtitle is “The 12 experiments that made the modern world”, starting with the cathode-ray tube and going all the way through to the LHC and what might come next. It’s told from the perspective of an experimental physicist. What isn’t always captured in popular physics books is how science is actually done, but it’s very human to feel like you’re failing in the lab. I also delve into what first interested me in accelerators, specifically the things that have emerged unexpectedly from these research areas. People think that Apple invented everything in the iPhone, but if it wasn’t for curiosity-driven physics experiments then it wouldn’t be possible. On a personal note, as I went through these stories in the field, often in the biographies and the acknowledgments, I would end up going down these rabbit holes of women whose careers were cut short because they got married and had to quit their job. It’s been lovely to have the opportunity to learn that these women were there, and it wasn’t just white men.
You have to learn how to take care of your mental health and well-being so that you don’t burn out
Do you have a preference as to which collider should come next after the LHC?
I think it should be one of the linear ones. The size of future circular colliders and the timescales involved are quite overwhelming, and you have to wonder if the politics might change throughout the project. A linear machine such as the ILC is more ready to go, if the money and will were there. But I also think there is value in the diversity of the technology. The scaling of SLAC’s linear electron machine, for example, really pushed the industrialisation of that accelerator technology – which is part of the reason why we have 3 GHz electron accelerators now in every hospital. There will be other implications to what we build, other than physics results – even though the decisions will be made on the physics.
What do you say to students considering a career in particle physics?
I will answer that from the perspective of the accelerator field, which is very exciting. If you look historically, new technologies have always driven new discoveries. The accelerator field is going through an interesting “technology discovery phase”, for example with laser-driven plasma accelerators, so there will be huge changes to what we are doing in 10–15 years’ time that could blow the decisions surrounding future colliders out of the water. This happened in the 1960s in the era of proton accelerators, when suddenly there was a new technology that meant you could build machines with much higher energies using smaller magnets, and the people who took that risk were the ones who ended up pushing the field forward. I sometimes feel experimental and theoretical physicists are slightly disconnected from what’s going on in accelerator physics now. When making future decisions, people should attend accelerator conferences…it may influence their choices.
The Skobeltsyn Institute of Nuclear Physics (SINP) was established at Lomonosov Moscow State University (MSU) on 1 February 1946, in pursuance of a decree of the government of the USSR. SINP MSU was created as a new type of institute, in which the principles of integrating higher education and fundamental science were prioritised. Its initiator and first director was Soviet physicist Dmitri Vladimirovich Skobeltsyn, who was known for his pioneering use of the cloud chamber to study the Compton effect in 1923 – aiding the discovery of the positron less than a decade later.
It is no coincidence that SINP MSU was established in the immediate aftermath of the Second World War, following the first use of nuclear weapons in conflict. The institute was created on the basis that it would train personnel specialising in nuclear science and technology, after the country realised that there was a shortage of specialists in the field. Thanks to strong leadership from Skobeltsyn and one of his former pupils, Sergei Nikolaevich Vernov, SINP MSU quickly gained recognition in the country. As early as 1949, the government designated it a leading research institute. By this time a 72 cm cyclotron was already in use, the first at a higher-education institute in the USSR.
Skobeltsyn and Vernov continued with their high ambitions as they expanded the facility to the Lenin Hills, along with other scientific departments of MSU. Proposed in 1949 and opened in 1953, the new site in Moscow was granted approval to host a set of accelerators and a special installation for studying extensive air showers (EASs). The first accelerator built there was a 120 cm cyclotron, and its first outstanding scientific achievement was the discovery by A F Tulinov of the so-called “shadow effect” in nuclear reactions on single crystals, which makes it possible to study nuclear reactions at ultra-short time intervals. Significant scientific successes were associated with the commissioning, at the end of the 1950s, of a unique installation, the EAS-MSU array, for the study of ultra-high-energy cosmic rays. Several results were obtained through a new method for studying EASs in the region of 10^15–10^17 eV, leading to the discovery of the famous “knee” in the energy spectrum of primary cosmic rays.
The space race
The year 1949 marked SINP MSU’s entrance into astrophysics and, in particular, satellite technology. The USSR’s launch of Sputnik 1, Earth’s first artificial satellite, in 1957 gave Vernov, an enthusiastic experimentalist who had previously researched cosmic rays in the Earth’s stratosphere, the opportunity to study cosmic rays above the atmosphere. This led to the installation of a Geiger counter on the Sputnik 2 satellite and a scintillation counter on Sputnik 3, to enable radiation experiments. Vernov’s experiments on Sputnik 2 enabled the first detection of the outer radiation belt. However, this was not confirmed until 1958 by the US’s Explorer 1, which carried an instrument designed and built by James Van Allen. Sputnik 3 confirmed the existence of an inner radiation belt, using data received from ground stations in Australia and South America, as well as from sea-based stations.
Vernov, who succeeded Skobeltsyn as SINP director and served from 1960 to 1982, later worked on the “Electron” and “Proton” series of satellites, which studied the radiation-belt structure, energy spectra and temporal variations associated with geomagnetic activity. This led to pioneering results on the spectrum and composition of galactic cosmic rays, and to the first model of radiation distribution in near-Earth space in the USSR.
SINP MSU has carried on Vernov’s cosmic legacy by continuing to develop equipment for satellites. Since 2005 the institute has developed its own space programme through the university satellites Tatiana-Universitetsky and Tatiana-2, as well as the Vernov satellite. These satellites led to the discovery of new phenomena, such as ultraviolet flashes from the atmosphere. In 2016 a tracking system for ultraviolet rays was installed on board the Lomonosov satellite (see “Vernov’s legacy” image), developed at SINP MSU under the guidance of former director Mikhail Igorevich Panasyuk. This allowed fluorescence light radiated by EASs of ultra-high-energy cosmic rays to be measured for the first time, as well as prompt-emission observations of gamma-ray bursts at multiple wavelengths. The leading role in the entire Lomonosov satellite mission belongs to the current rector of MSU, Victor Sadovnichy.
High-energy exploration
In 1968, with strong endorsement from Vernov and the director of a new Russian accelerator centre in Protvino, Anatoly Alekseyevich Logunov (who went on to be MSU rector from 1977 to 1991), a department of high-energy physics was established under the leadership of V G Shevchenko at SINP MSU, and the following year it was decided that a high-energy laboratory would be established at MSU. Throughout the years that followed, collaborations with laboratories in the USSR and across the world, including CERN, Fermilab, DESY and the Joint Institute for Nuclear Research (JINR), led the department to the forefront of the field.
At the end of the 1970s a centre was created at SINP MSU for bubble-chamber film analysis. At the time it was one of the largest automated complexes for processing and analysing information from large tracking detectors in the country. In collaboration with other institutes worldwide, staff at the institute studied soft hadronic processes in the energy range 12–350 GeV at a number of large facilities, including the Mirabelle Hydrogen Bubble Chamber and European Hybrid Spectrometer.
Extensive and unique experimental data have been obtained on the characteristics of multiple hadron production, including fragmentation distributions. Throughout the years, exclusive reaction channels, angular and momentum correlations of secondary particles, resonance production processes and annihilation processes were also investigated. These results have made it possible to reliably test the predictions of phenomenological models, including the dual-parton model and the quark–gluon string model, based on the fundamental theoretical scheme of dual-topological unitarisation.
For the first time in Russia, an integrated system for the development, design, mass production and testing of large silicon solid-state and microstrip detectors has been created, together with a number of scientific and technical enterprises and with SINP MSU playing the leading role. On this basis, at the turn of the millennium a hadron–electron separator was built for the ZEUS experiment at HERA, DESY.
The institute delved into theoretical studies in 1983, with the establishment of the laboratory of symbolic computations in high-energy physics and, in 1990, the department of theoretical high-energy physics. One of its most striking achievements was the creation of the CompHEP software package, which has received global recognition for its ability to automate calculations of collisions between elementary particles and their decays within the framework of gauge theories. This is freely available and allows physicists (even those with little computer experience) to calculate cross sections and construct various distributions for collision processes within the Standard Model and its extensions. Members of the department later went on to make a significant contribution to the creation of a Tier-2 Grid computer segment in Russia for processing and storing data from the LHC detectors.
Over the past 35 years, accelerator research at SINP MSU has moved from the development of large accelerator complexes for fundamental research to the creation and production of applied accelerators for security systems, industry and medicine.
Teaching legacy
Throughout its 75 years, SINP MSU has also nurtured thousands of students. In 1961 a new branch of SINP MSU, the department for nuclear research, was established in Dubna. It became the basis for training students from the MSU physics faculty in nuclear physics using the capabilities of the largest international scientific centre in Russia – JINR. The department, which is still going strong today, teaches with a hands-on approach, with students attending lectures by leading JINR scientists and taking part in practical training held at the JINR laboratories.
The institute is currently participating in the upgrade of the LHC detectors (CMS, ATLAS, LHCb) for the HL-LHC project, as well as in projects within the Physics Beyond Colliders initiative (e.g. NA64, SHiP). These actions are under the umbrella of a 2019 cooperation agreement between CERN and Russia concerning high-energy physics and other domains of mutual interest. Looking even further ahead, SINP MSU scientists are also working on the development of research programmes for future collider projects such as the FCC, CLIC and ILC. Furthermore, the institute is involved in the upcoming NICA Complex in Russia, which plans to finish construction in 2022.
After 75 years, the institute is still as relevant as ever, and whatever the next chapter of particle physics will be, SINP MSU will be involved.
Experimental nuclear physicist Haiyan Gao has been appointed associate laboratory director for nuclear and particle physics at Brookhaven National Laboratory (BNL), beginning 1 June. Gao, whose research interests include the structure of the nucleon, searches for exotic QCD states and searches for new physics in electroweak interactions, is currently a professor of physics at Duke University, and has previously held positions at Argonne National Laboratory and the Massachusetts Institute of Technology (MIT). At BNL she replaces Dmitri Denisov, who has held the position on an interim basis since Berndt Mueller’s departure last year.
While at Duke, Gao was the founding vice chancellor for academic affairs at the new Duke Kunshan University, based in Kunshan, China — a Chinese-American academic partnership between Duke University and Wuhan University established in 2013.
I am very excited by the opportunity and the impact I will be able to make in collaboration with many people at the lab
The appointment comes at a vital time for BNL, with preparations taking place for the Electron-Ion Collider, which expects first physics in the next decade. The unique facility will, for the first time, be able to systematically explore and map out the dynamical system that is the ordinary QCD bound state. On the appointment, Gao states: “The nuclear & particle physics directorate is well-known internationally in accelerator science, high-energy physics, and nuclear physics. I am very excited by the opportunity and the impact I will be able to make in collaboration with many people at the Lab.”
The ATLAS, CMS and LHCb collaborations perform precise measurements of Standard Model (SM) processes and direct searches for physics beyond the Standard Model (BSM) in a vast variety of channels. Despite the multitude of BSM scenarios tested this way by the experiments, it still constitutes only a small subset of the possible theories and parameter combinations to which the experiments are sensitive. The (re)interpretation of the LHC results in order to fully understand their implications for new physics has become a very active field, with close theory–experiment interaction and with new computational tools and related infrastructure being developed.
From 15 to 19 February, almost 300 theorists and experimental physicists gathered for a week-long online workshop to discuss the latest developments. The topics covered ranged from advances in public software packages for reinterpretation to the provision of detailed analysis information by the experiments, from phenomenological studies to global fits, and from long-term preservation to public data.
Open likelihoods
One of the leading questions throughout the workshop was that of public likelihoods. The statistical model of an experimental analysis provides its complete mathematical description; it is essential information for determining the compatibility of the observations with theoretical predictions. In his keynote talk “Open science needs open likelihoods’’, Harrison Prosper (Florida State University) explained why it is in our scientific interest to make the publication of full likelihoods routine and straightforward. The ATLAS collaboration has recently made an important step in this direction by releasing full likelihoods in a JSON format, which provides background estimates, changes under systematic variations, and observed data counts at the same fidelity as used in the experiment, as presented by Eric Schanet (LMU Munich). Matthew Feickert (University of Illinois) and colleagues gave a detailed tutorial on how to use these likelihoods with the pyhf python package. Two public reinterpretation tools, MadAnalysis5 presented by Jack Araz (IPPP Durham) and SModelS presented by Andre Lessa (UFABC Santo Andre) can already make use of pyhf and JSON likelihoods, and others are to follow. An alternative approach to the plain-text JSON serialisation is to encode the experimental likelihood functions in deep neural networks, as discussed by Andrea Coccaro (INFN Genova) who presented the DNNLikelihood framework. Several more contributions from CMS, LHCb and from theorists addressed the question of how to present and use likelihood information, and this will certainly stay an active topic at future workshops.
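As a sketch of what using such a release looks like in practice, the snippet below loads a published JSON workspace with the pyhf python package and evaluates observed and expected CLs values; the file name and the signal-strength hypothesis are illustrative assumptions, not taken from any particular ATLAS release.

# Minimal sketch, assuming pyhf >= 0.6 and a HistFactory JSON workspace
# downloaded from HEPData; "analysis_likelihood.json" is a placeholder name.
import json
import pyhf

with open("analysis_likelihood.json") as f:
    spec = json.load(f)

workspace = pyhf.Workspace(spec)   # parse the published statistical model
model = workspace.model()          # build the HistFactory pdf
data = workspace.data(model)       # observed counts plus auxiliary data

# Asymptotic CLs hypothesis test for an illustrative signal strength mu = 1
cls_obs, cls_exp = pyhf.infer.hypotest(
    1.0, data, model, test_stat="qtilde", return_expected=True
)
print(f"Observed CLs = {float(cls_obs):.3f}, expected CLs = {float(cls_exp):.3f}")

Reinterpretation tools such as MadAnalysis5 and SModelS wrap calls of this kind internally when confronting new-physics models with the released likelihoods.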
The question of making research data findable, accessible, interoperable and reusable is a burning one throughout modern science
A novelty for the Reinterpretation workshop was that the discussion was extended to experiences and best practices beyond the LHC, to see how experiments in other fields address the need for publicly released data and reusable results. This included presentations on dark-matter direct detection, the high-intensity frontier, and neutrino oscillation experiments. Supporting Prosper’s call for data reusability 40 years into the future – “for science 2061” – Eligio Lisi (INFN Bari) pointed out the challenges met in reinterpreting the 1998 Super-Kamiokande data, initially published in terms of the then-sufficient two-flavour neutrino-oscillation paradigm, within contemporary three-neutrino descriptions and beyond. On the astrophysics side, the LIGO and Virgo collaborations actively pursue an open-science programme. Here, Agata Trovato (APC Paris) presented the Gravitational Wave Open Science Center, giving details on the available data, on their format and on the tools to access them. An open-data policy also exists at the LHC, spearheaded by the CMS collaboration, and Edgar Carrera Jarrin (USF Quito) shared experiences from the first CMS open-data workshop.
The question of making research data findable, accessible, interoperable and reusable (“FAIR” in short) is a burning one throughout modern science. In a keynote talk, the head of the GO FAIR Foundation, Barend Mons, explained the FAIR Guiding Principles together with the technical and social aspects of FAIR data management and data reuse, using the example of COVID-19 disease modelling. There is much to be learned here for our field.
The wrap-up session revolved around the question of how to implement the recommendations of the Reinterpretation workshop in a more systematic way. An important aspect here is the proper recognition, within the collaborations as well as the community at large, of the additional work required to this end. More rigorous citation of HEPData entries by theorists may help in this regard. Moreover, a “Reinterpretation: Auxiliary Material Presentation” (RAMP) seminar series will be launched to give more visibility and explicit recognition to the efforts of preparing and providing extensive material for reinterpretation. The first RAMP meetings took place on 9 and 23 April.
The CMS collaboration, in partnership with the Geneva-based Sharing Knowledge Foundation, has launched a fundraising initiative to support the Lebanese scientific community during an especially difficult period. Lebanon signed an international cooperation agreement with CERN in 2016, which triggered a strong development of the country’s contributions to CERN projects, particularly to the CMS experiment through the affiliation of four of its top universities. Yet the country is dealing with an unprecedented economic crisis, food shortages, an influx of Syrian refugees and the COVID-19 pandemic, all in the aftermath of the Beirut port explosion in August 2020.
“Even the most resilient higher-education institutions in Lebanon are struggling to survive,” says CMS collaborator Martin Gastal of CERN, who initiated the fundraising activity in March. “Despite these challenges, the Lebanese scientific community has reaffirmed its commitment to CERN and CMS, but it needs support.”
One project, High-Performance Computing for Lebanon (HPC4L), which was initiated to build Lebanon’s research capacity while contributing as a Tier-2 centre to the analysis of CMS data, is particularly at risk. HPC4L was due to benefit from servers donated by CERN to Lebanon, and from the transfer of CERN and CMS knowledge and expertise to train a dedicated support team that will run a high-performance computing facility there. But the hardware could not be shipped from CERN because of a lack of available funding. CMS and the Sharing Knowledge Foundation are therefore fundraising to cover the shipping costs of the donated hardware, to purchase hardware to allow its installation, and to support Lebanese experts while they are trained at CERN by the CMS offline computing team.
“At this pivotal moment, every effort to help Lebanon counts,” says Gastal. “CMS is reaching out for donations to support this initiative, to help both the Lebanese research community and the country itself.”
Recent decades have seen an emphasis on the market and social value of fundamental science. Increasingly, researchers must demonstrate the benefits of their work beyond the generation of pure scientific knowledge, and the cultural benefits of peaceful and open international collaboration.
This timely collection of short essays by leading scientific managers and policymakers, which emerged from a workshop held during Future Circular Collider (FCC) Week 2019, brings the interconnectedness of fundamental science and economics into focus. Its 18 contributions range from procurement to knowledge transfer, and from global-impact assessments to case studies from CERN, SKA, the ESS and ESA, with a foreword by former CERN Director-General Rolf Heuer. As such, it constitutes an important contribution to the literature and a guide for future projects such as a post-LHC collider.
As the number and size of research infrastructures (RIs) have grown over the years, explains CERN’s head of industry, procurement and knowledge transfer, Thierry Lagrange, the will to push the frontier of knowledge has required significant additional public spending linked to the development and upgrade of high-tech instruments, and increased maintenance costs. The socioeconomic returns to society are clear, he says. But these benefits are not generated automatically: they require a thriving ecosystem that transfers knowledge and technologies to society, aided by entities such as CERN’s knowledge transfer group and business incubation centres.
RIs need to be closely integrated into the European landscape, with plans put in place for international governance structures
Multi-billion public investments in RIs are justified given their crucial and multifaceted role in society, asserts EIROforum liaison officer at the European Commission, Margarida Ribeiro. She argues that new RIs need to be closely integrated into the European landscape, with plans put in place for international governance structures, adequate long-term funding, closer engagement with industry, and methodologies for assessing RI impact. All contributors acknowledge the importance of this latter point. While physicists would no doubt prefer to go back to the pre-Cold War days of doing science for science’s sake, argues ESS director John Womersley, without the ability to articulate the socioeconomic justifications of fundamental science as a driver of prosperity, jobs, innovation, startups and as solutions to challenges such as climate change and the environment, it is only going to become more difficult for projects to get funding.
A future collider is a case in point. Johannes Gutleber of CERN and the FCC study describes several recent studies seeking to quantify the socioeconomic value of the LHC and its proposed successor, the FCC, with training and industrial innovation emerging as the most important generators of impact. The rising interest in the type of RI benefits that emerge and how they can be maximised and redistributed to society, he writes, is giving rise to a new field of interdisciplinary research, bringing together economists, social scientists, historians and philosophers of science, and policymakers.
Nowhere is this better illustrated than in the ongoing programme led by economists at the University of Milan, described in two chapters by Massimo Florio and Andrea Bastianin. A recent social cost–benefit analysis of the HL-LHC, for example, conservatively estimates that every €1 of costs returns €1.2 to society, while a similar study concerning the FCC estimates the benefit/cost ratio to be even higher, at 1.8. Florio argues that CERN and big science more generally are ideal testing grounds for theoretical and empirical economic models, while demonstrating the positive net impact that large colliders have for society. His 2019 book Investing in Science: Social Cost-Benefit Analysis of Research Infrastructures (MIT Press) explores this point in depth (CERN Courier September 2018 p51), and is another must-read in this growing interdisciplinary area. Completing the series of essays on impact evaluation, Philip Amison of the UK’s Science and Technology Facilities Council reviews the findings of a report published last year capturing the benefits of CERN membership.
The final part of the volume focuses on the question “Who benefits from such large public investments in science?”, and addresses the contribution of big science to social justice and inequalities. Carsten Welsch of the University of Liverpool/Cockcroft Institute argues that fundamental science should not be considered as a distant activity, illustrating the point convincingly via the approximately 50,000 particle accelerators currently used in industry, medical treatments and research worldwide.
The grand ideas and open questions in particle physics and cosmology already inspire many young people to enter STEM subjects, while technological spin-offs such as medical treatments, big-data handling, and radio-frequency technology are also often communicated. Less well known are the significant but harder-to-quantify economic benefits of big science. This volume is therefore essential reading, not just for government ministers and policymakers, but for physicists and others working in curiosity-driven research who need to convey the immense benefits of their work beyond pure knowledge.
The eminent theoretical physicist Roger Julian Noel Phillips died peacefully on 4 September 2020, aged 89, at his home in Abingdon, UK. Roger was educated at Trinity College, Cambridge, where he received his PhD in 1955. His thesis advisor was Paul Dirac. Roger transferred from the Harwell theory group to the Rutherford Appleton Laboratory (RAL) in 1962 where he led the theoretical high-energy physics group to international prominence. He also held visiting appointments at CERN, Berkeley, Madison and Riverside.
Roger was a giant in particle physics phenomenology and his book Collider Physics (Addison-Wesley, 1987), co-authored with his longstanding collaborator Vernon Barger, remains a classic. In 1990 Roger was awarded the Rutherford Medal and Prize of the UK Institute of Physics. To experimenters, he was one of the rocks upon whom the UK high-energy physics community was built. To theorists, he was renowned for his deep understanding of particle-physics models. A career-long collaboration across the Atlantic with Barger ensued from their sharing an office at CERN in 1967. Their initial focus was the Regge-pole model to describe high-energy scattering of hadrons. Subsequently they inferred the momentum distribution of the light quarks and gluons from deep-inelastic scattering data and made studies to identify the charm-quark signal in a Fermilab neutrino experiment.
To experimenters, he was one of the rocks upon whom the UK high-energy physics community was built
In 1980, Phillips and collaborators discovered the resonance in neutrino oscillations that occurs when neutrinos propagate long distances through matter. This work is the basis of the ongoing Fermilab long-baseline neutrino programme, which will make precision determinations of neutrino masses and mixing. From 1983, Phillips and his collaborators developed pioneering strategies in collider physics for finding the W boson, the top quark and the Higgs boson, and for searches for physics beyond the Standard Model. In an influential 1990 publication, Phillips, Hewett and Barger showed that the decay of a b-quark to an s-quark and a photon is a highly sensitive probe of a charged Higgs boson through its one-loop virtual contribution.
After retiring in 1997, Roger maintained an active interest in particle physics. He struggled with Parkinson’s disease in recent years but continued to live with determination, wit and cheer. He joked that his Parkinson’s tremor made his mouse and keyboard run wild: “I know that an infinite number of random monkeys can eventually write Shakespeare, but I can’t wait that long!” One of his very last whispers to his son David was: “There are symmetries in mathematics which are like aspects of dreaming”. He did great things with his brain during his life, and its contribution will continue: he donated it to the Parkinson’s UK Brain Bank.
Roger was highly respected for his intellectual brilliance, physics leadership and immense integrity, but also for his modesty and generosity in going out of his way to help others. He was a delight to work with and an inspiration to all who knew him. He is missed by his many friends around the world.
A greying giant of the field speaks to the blackboard for 45 minutes before turning, dismissively seizing paper and scissors, and cutting a straight slit. The sheet is twisted to represent the conical space–time described by the symbols on the board. A lecture theatre of students is transfixed in admiration.
This is not the teaching style advocated by José Mestre and Jennifer Docktor in their new book The Science of Learning Physics. And it’s no longer typical, say the authors, who suggest that approximately half of physics lecturers use at least one “evidence-based instructional practice” – jargon, most often, for an interactive teaching method. As colleagues joked when I questioned them on their teaching styles, there is still a performative aspect to lecturing, but these days it is just as likely to reflect the rock-star feeling of having a hundred camera phones pointed at you – albeit so the students can snap a QR code on your slide to take part in an interactive mid-lecture quiz.
Swiss and Soviet developmental psychologists Jean Piaget and Lev Vygotsky are duly namechecked
Mestre and Docktor, who are both educational psychologists with a background in physics, offer intriguing tips to maximise the impact of such practices. After answering a snap poll, they say, students should discuss with their neighbour before being polled again. The goal is not just to allow the lecturer to tailor their teaching, but also to allow students to “construct” their knowledge. Lecturing, they say, gives piecemeal information, but does not connect it. Neurons fire, but synaptic connections are not trained. And as the list of neurotransmitters that reinforce synaptic connections includes dopamine and serotonin, making students feel good by answering questions correctly may be worth the time investment.
Relative to lecturers in other sciences, physics lecturers are leading the way in implementing evidence-based instructional practices, but far too few are well trained, say Mestre and Docktor, who want to bring the tools and educational philosophies of the high-school physics teacher to the lecture theatre. Swiss and Soviet developmental psychologists Jean Piaget and Lev Vygotsky are duly namechecked. “Think–pair–share”, mini whiteboards and flipping the classroom (not a discourteous gesture but the advance viewing of pre-recorded lectures before a more participatory lecture) are the order of the day. Students are not blank slates, they write, but have strong attachments to deeply ingrained and often erroneous intuitions that they have previously constructed. Misconceptions cannot be supplanted wholesale, but must be unknotted strand by strand. Lecturers should therefore explicitly describe their thought processes and encourage students to reflect on “metacognition”, or “thinking about thinking”. Here the text is reminiscent of Nobelist Daniel Kahneman’s seminal text Thinking, Fast and Slow, which divides thinking into two types: “system 1”, which is instinctive and emotional, and “system 2”, which is logical but effortful. Lecturers must fight against “knee-jerk” reasoning, say Mestre and Docktor, by modelling the time-intensive construction of knowledge, rather than aspiring to misleading virtuoso displays of mathematical prowess. Wherever possible, this should be directly assessed by giving marks not just for correct answers, but also for identifying the “big idea” and showing your working.
Disappointingly, examples are limited to pulleys and ramps, and, somewhat ironically, the book’s dusty academic tone may prove ineffective at teaching teachers to teach. But no other book comes close to The Science of Learning Physics as a means for lecturers to reflect on and enrich their teaching strategies, and it is highly recommended on that basis. That said, my respect for my old general-relativity lecturer remained undimmed as I finished the last page. Those old-fashioned lectures were hugely inspiring – a “non-cognitive aspect” that Mestre and Docktor admit their book does not consider.