
Bernhard Spaan 1960–2021

Bernhard Spaan

Bernhard Spaan, an exceptional particle physicist and a wonderful colleague, passed away unexpectedly on 9 December, much too early, at the age of 61.

Bernhard studied physics at the University of Dortmund, completing his diploma thesis in 1985 on the ARGUS experiment at DESY’s electron–positron collider DORIS. Together with CLEO at Cornell, ARGUS was among the first experiments dedicated to heavy-flavour physics, which became the central theme of Bernhard’s research for the following 36 years. Progressing from ARGUS and CLEO to the higher-statistics experiments BaBar and ultimately LHCb, to which he made early contributions, he was one of the pioneering leaders in the next generation of heavy-flavour experiments at both electron–positron and hadron colliders.

While working on tau-lepton decays at ARGUS for his doctorate, Bernhard led a study of tau decays to five charged pions and a tau neutrino, which resulted in the world’s best upper limit on the tau-neutrino mass at the time. He also pioneered a new method of reconstructing the pseudo-mass of the tau lepton by approximating the tau direction with that of the hadronic system. This method led to a new measurement of the tau-lepton mass, an important ingredient in resolving the long-standing deviation from lepton universality derived from measurements of the tau lifetime, mass and leptonic branching fraction.
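
For readers who want the formula, here is a minimal sketch of the pseudo-mass idea (our notation, neglecting the neutrino mass; a sketch of the technique rather than the exact ARGUS formulation). Since the taus are produced in pairs in electron–positron annihilation, the tau energy equals the beam energy E_beam, so for a decay τ → hν with a hadronic system h of energy E_h, momentum p_h and invariant mass m_h,

\[
m_\tau^2 = m_h^2 + 2E_\nu\left(E_h - |\vec{p}_h|\cos\theta_{h\nu}\right) \;\geq\; m_h^2 + 2\,(E_\mathrm{beam} - E_h)\,(E_h - |\vec{p}_h|),
\]

with E_ν = E_beam − E_h. The bound is saturated when the neutrino is collinear with the hadronic system; the distribution of the right-hand side (the pseudo-mass) over many decays has a sharp edge at the true tau mass.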

In 1993 Bernhard joined McGill University in Montreal, where he contributed to CLEO operation, data-taking and analysis, and was brought into contact with the formative stages of an asymmetric electron–positron B-factory at SLAC. He was an author of the BaBar letter of intent in 1994 and remained a leading member of the collaboration for the following two decades.

Bernhard saw the unique potential of a dedicated B experiment at the LHC and joined the LHCb collaboration

In 1996 Bernhard took up a professorship in Dresden where, together with Klaus Schubert, he built a strong German BaBar participation, including involvement in the construction and operation of the calorimeter. At that time, BaBar was pioneering the use of distributed computing resources for data processing. As one of the proponents of this approach, Bernhard played a crucial role in the German contribution via the computing centre at Karlsruhe, later “GridKa”. Building on the success of the electron–positron B-factories, Bernhard saw the unique potential of a dedicated B experiment at the LHC and joined the LHCb collaboration in 1998.

Bernhard’s scientific journey came full circle when he accepted a professorship at Dortmund University in 2004, which he used to significantly grow his LHCb participation. The Dortmund group is one of LHCb’s largest, with a long list of graduate students and main research topics that include the determination of the CKM angles β and γ, which govern CP violation, and rare B decays. In parallel with LHC Run 1 and 2 data-taking, Bernhard investigated the possibility of using scintillating fibres for a novel tracking detector capable of operating at much larger luminosities. In all phases of the “SciFi” detector, which was recently installed ahead of LHC Run 3, he supported the project with his ideas, his energy and the commitment of his group.

Bernhard was an outstanding experimental physicist whose many contributions shaped the field of experimental heavy-flavour physics. He was also a great communicator. His ability to resolve conflicts and to find compromises brought Bernhard many additional tasks, whether as dean of the Dortmund faculty, chair of the national committee for particle physics, member of R-ECFA or chair of the LHCb collaboration board. When help was needed, Bernhard never said “no”.

We have lost a tremendous colleague and a dear friend who will be sorely missed not only by us, but by the wider field.

One day in September: Copenhagen

The ghosts of Niels Bohr, Werner Heisenberg and Margrethe Bohr

“But why?” asks Margrethe Bohr. Her husband, Niels, replies: “Does it matter, my love, now that we’re all three of us dead and gone?” Joined by Werner Heisenberg, the couple look like spirits meeting in an atemporal dimension, maybe the afterlife, under an eerie ring of light. Dominating an almost empty stage, the trio try to revive what happened on one day in September 1941, when Heisenberg, a prominent figure in Hitler’s Uranverein (Uranium Club), travelled to Nazi-occupied Denmark to visit his former mentor, Niels Bohr.

Why did Heisenberg go to meet Bohr that day? Did he seek an agreement not to develop the bomb in Germany? Was he searching for intelligence on Allied progress? To convince Bohr that there was no German programme? Or to pick Bohr’s brain on atomic physics? Or, according to Margrethe, to show off? Perhaps his motives were a superposition of all of these. No one knows what was said. This puzzle has intrigued historians ever since. 

Eighty years after that meeting, and 23 since Michael Frayn’s masterwork Copenhagen premiered at the National Theatre in London, award-winning director Polly Findlay and Emma Howlett, in her professional directorial debut, have revived a play that contains little action but much physics and food for thought.

The three actors orbit like electrons in an atom

Frayn’s nonlinear script is based on three possible versions of the same meeting in Copenhagen in 1941, which can be construed as three different scenarios playing out in the many-worlds interpretation of quantum mechanics. He describes it as the process of rewriting a draft of a paper again and again, trying to unlock more secrets. In the afterlife, the trio’s dialogue jumps back and forth in time, adding confusing memories and contradictory hypotheses. Delivered at pace, the narrative explores the historical record and the trio’s personal stories.

The three characters reflect on how German scientists failed to build the bomb, even though they had the best start: Otto Hahn, Lise Meitner and Fritz Strassmann had discovered nuclear fission in late 1938. But Frayn highlights how Hitler’s Deutsche Physik was hostile to so-called Jewish physics and key Jewish physicists, including Bohr, who later fled to Los Alamos in the US. Frayn’s Heisenberg reveals the disbelief he felt when he learnt about the destruction of Hiroshima on the radio. At the time he was detained at Farm Hall, not far from this theatre in Cambridge in the UK, together with other members of the Uranium Club. In an operation codenamed Epsilon, the bugged hall was used by the Allied forces to try to uncover the state of Nazi scientific progress.

The three actors orbit like electrons in an atom, while the theatre’s revolving stage itself spins. Superb acting by Philip Arditti and Malcolm Sinclair elucidates an extraordinary student–mentor relationship between Heisenberg and Bohr. The sceptical Mrs Bohr (Haydn Gwynne) steers the conversation and questions their friendship, cajoling Bohr to speak in plain language. Nevertheless, the use of scientific jargon could leave some non-experts in the audience behind. 

Although Heisenberg wrote in his autobiography that “it would be better to stop disturbing the spirits of the past,” the private conversation between the two physicists has stirred the interest of the public, journalists and historians for years. In 1956 the journalist Robert Jungk claimed in his much-debated book, Brighter than a Thousand Suns, that Heisenberg had wanted to prevent the development of an atomic bomb. This book was also an inspiration for Frayn’s play. More recently, in 2002, Bohr’s family released letters that Bohr had written to Heisenberg but never sent. According to these letters, Bohr was convinced that Heisenberg was building the bomb in Germany.

To this day, the reason for Heisenberg’s visit to Copenhagen remains uncertain, or unknowable, like the properties of a quantum particle that’s not observed. The audience can only imagine what really happened, while considering all philosophical interpretations of the fragility of the human species. 

Witten reflects

Edward Witten

How has the discovery of a Standard Model-like Higgs boson changed your view of nature? 

The discovery of a Standard Model-like Higgs boson was a great triumph for renormalisable field theory, and really for simplicity. By the time the LHC was operating, attempts to make the Standard Model (SM) work without an elementary Higgs field – using a dynamical mechanism instead – had become rather convoluted. It turned out that, as far as one can judge from what we have learned so far, the original idea of an elementary Higgs particle was correct. This also means that nature takes advantage of all the possible building blocks of renormalisable field theory – fields of spin 0, 1/2 and 1 – and the flexibility that that allows. 

The other key fact is that the Higgs particle has appeared by itself, and without any sign of a mechanism that would account for the smallness of the energy scale of weak interactions compared to the much larger presumed energy scales of gravity, grand unification and cosmic inflation. From the perspective that my generation of particle physicists grew up with (and not only my generation, I would say), this is quite a shock. Of course, we lived through a somewhat similar shock a little over 20 years ago with the discovery that the expansion of the universe is accelerating – something that is most simply interpreted in terms of a very small but positive cosmological constant, the energy density of the vacuum. It seems that the ideas of naturalness that we grew up with are failing us in at least these two cases.

What about new approaches to the fine-tuning problem such as the relaxion or “Nnaturalness”?

Unfortunately, it has been very hard to find a conventional natural explanation of the dark energy and hierarchy problems. Reluctantly, I think we have to take seriously the anthropic alternative, according to which we live in a universe that has a “landscape” of possibilities, which are realised in different regions of space or maybe in different portions of the quantum mechanical wavefunction, and we inevitably live where we can. I have no idea if this interpretation is correct, but it provides a yardstick against which to measure other proposals. Twenty years ago, I used to find the anthropic interpretation of the universe upsetting, in part because of the difficulty it might present in understanding physics. Over the years I have mellowed. I suppose I reluctantly came to accept that the universe was not created for our convenience in understanding it.

Which experimental paths should physicists prioritise at this time?

It is extremely important to probe the twin mysteries of the cosmic acceleration and the smallness of the electroweak scale as thoroughly as possible, in order to determine whether we are interpreting the facts correctly and possibly to discover a new layer of structure. In the case of the cosmic acceleration, this means measuring as precisely as we can the parameter w (the ratio of pressure to energy density), which equals –1 if the acceleration of the expansion is governed by a simple cosmological constant, but would be greater than –1 in most alternative models. In particle physics, we would like to probe for further structure as precisely as we can both indirectly, for example with precision studies of the Higgs particle, and hopefully directly by going to higher energies than are available at the LHC.
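
As a reminder of where w enters (a standard result of FRW cosmology, added here for context rather than taken from Witten’s remarks): for a fluid with pressure p = wρ, the acceleration equation in units with c = 1 reads

\[
\frac{\ddot{a}}{a} = -\frac{4\pi G}{3}\,(\rho + 3p) = -\frac{4\pi G}{3}\,\rho\,(1 + 3w),
\]

so accelerated expansion requires w < –1/3, and a pure cosmological constant gives exactly w = –1.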

What might be lurking at energies beyond the LHC?

If it is eventually possible to go to higher energies, I can imagine several possible outcomes. It might become rather clear that the traditional idea of naturalness is not the whole story and that we have on our hands a “bare” Higgs particle, without a mechanism that would account for its mass scale. Alternatively, we might find out that the apparent failure of naturalness was an illusion and that additional particles and forces that provide an explanation for the electroweak scale are just beyond our current experimental reach. There is also an intermediate possibility that I find fascinating. This is that the electroweak scale is not natural in the customary sense, but additional particles and forces that would help us understand what is going on exist at an energy not too much above LHC energies. A fascinating theory of this type is the “split supersymmetry” that has been proposed by Nima Arkani-Hamed and others.  

It seems that the ideas of naturalness that we grew up with are now failing us 

There is an obvious catch, however. It is easy enough to say “such-and-such will happen at an energy not too much above LHC energies”. But for practical purposes, it makes a world of difference whether this means three times LHC energies, six times LHC energies, 25 times LHC energies, or more. In theories such as split supersymmetry, the clues that we have are not sufficient to enable a real answer. A dream would be to get a concrete clue from experiment about what is the energy scale for new physics beyond the Higgs particle. 

Could the flavour anomalies be one such clue?

There are multiple places that new clues could come from. The possible anomalies in b physics observed at CERN are extremely significant if they hold up. The search for an electric dipole moment of the electron or neutron is also very important and could possibly give a signal of something new happening at energies close to those that we have already probed. Another possibility is the slight reported discrepancy between the magnetic moment of the muon and the SM prediction. Here, I think it is very important to improve the lattice gauge theory estimates of the hadronic contribution to the muon moment, in order to clarify whether the fantastically precise measurements that are now available are really in disagreement with the SM. Of course, there are multiple other places that experiment could pinpoint the next energy scale at which the SM needs to be revised, ranging from precision studies of the Higgs particle to searches for muon decay modes that are absent in the SM. 

Which current developments in theory are you most excited about?

The new ideas about gravity and quantum mechanics that go under the rough title “It from qubit” are really exciting. Black-hole thermodynamics was discovered in the 1970s through the work of Jacob Bekenstein, Stephen Hawking and others. These results were fascinating, but for several decades it seemed to me – rightly or wrongly – that this field was evolving only slowly compared to other areas of theoretical physics. In the past decade or so, that is clearly no longer the case. In large part the change has come from thinking about “entropy” as microscopic or fine-grained von Neumann entropy, as opposed to the thermodynamic entropy that Bekenstein and others considered. A formulation in terms of fine-grained entropy has made possible new and more general statements, which reduce to the traditional ones when thermodynamics is valid. All this has been accelerated by the insights that come from holographic duality between gravity and gauge theory.
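
For concreteness (our gloss, not Witten’s wording): the fine-grained entropy of a system described by a density matrix ρ is the von Neumann entropy

\[
S(\rho) = -\,\mathrm{Tr}\,(\rho \ln \rho),
\]

which is exactly conserved under unitary time evolution; the thermodynamic entropy considered in the 1970s is, by contrast, a coarse-grained quantity, defined in or near equilibrium, that can grow.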

How different does the field look today compared to when you entered it?

It is really hard to exaggerate how the field has changed. I started graduate school at Princeton in September 1973. Asymptotic freedom of non-abelian gauge theory had just been discovered a few months earlier by David Gross, Frank Wilczek and David Politzer. This was the last key ingredient that was needed to make possible the SM as we know it today. Since then there has been a revolution in our experimental knowledge of the SM. Several key ingredients (new quarks, leptons and the Higgs particle) were unknown in 1973. Jets in hadronic processes were still in the future, even as an idea, let alone an experimental reality, and almost nothing was known about CP violation or about scaling violations in high-energy hadronic processes, just to mention two areas that developed later in an impressive way.

6D Calabi–Yau manifolds

Not only is our experimental knowledge of the SM so much richer than it was in 1973, but the same is really true of our theoretical understanding as well. Quantum field theory is understood much better today than was the case in 1973. There really is no comparison.

Perhaps equally dramatic has been the change in our understanding of cosmology. In 1973, the state of cosmological knowledge could be summarised fairly well in a couple of numbers – notably the cosmic microwave background temperature and the Hubble constant – and of these only the first was measured with any reasonable precision. In the intervening years, cosmology became a precision science and also a much more ambitious science, as cosmologists have learned to grapple with the complex processes of the formation of structure in the universe. In the inhomogeneities of the microwave background, we have observed what appear to be the seeds of structure formation. And the theory of cosmic inflation, which developed starting around 1980, seems to be a real advance over the framework in which cosmology was understood in 1973, though it is certainly still incomplete.

Exploring the string-theory framework has led to a remarkable series of discoveries

Finally, 50 years ago the gulf between particle physics and gravity seemed unbridgeably wide. There is still a wide gap today. But the emergence in string theory of a sensible framework to study gravity unified with particle forces has changed the picture. This framework has turned out to be very powerful, even if one is not motivated by gravity and one is just searching for new understanding of ordinary quantum field theory. We do not understand today in detail how to unify the forces and obtain the particles and interactions that we see in the real world. But we certainly do have a general idea of how it can work, and this is quite a change from where we were in 1973. Exploring the string-theory framework has led to a remarkable series of discoveries. This well has not run dry, and that is one of the reasons that I am optimistic about the future.

Which of the numerous contributions you have made to particle and mathematical physics are you most proud of?

I am most satisfied with the work that I did in 1994 with Nathan Seiberg on electric-magnetic duality in quantum field theory, and also the work that I did the following year in helping to develop an analogous picture for string theory.

Who knows, maybe I will have the good fortune to do something equally significant again in the future.


Multidisciplinary CERN forum tackles AI

Anima Anandkumar

The inaugural Sparks! Serendipity Forum attracted 49 leading computer scientists, policymakers and related experts to CERN from 17 to 18 September for a multidisciplinary science-innovation forum. In this first edition, participants discussed a range of ethical and technical issues related to artificial intelligence (AI), which has deep and developing importance for high-energy physics and its societal applications. The structure of the discussions was designed to stimulate interactions between AI specialists, scientists, philosophers, ethicists and other professionals with an interest in the subject, leading to new insights, dialogue and collaboration between participants.

World-leading cognitive psychologist Daniel Kahneman opened the public part of the event by discussing errors in human decision making, and their impact on AI. He explained that human decision making will always be subject to bias and, in his terminology, to “noise”, and asked whether AI could be the solution, pointing out that AI algorithms might not be able to cope with the complexity of decisions that humans have to make. Others speculated as to whether AI could ever achieve the reproducibility of human cognition – and whether the focus should shift from searching for a “missing link” to considering how AI research is actually conducted, by making the process more regulated and transparent.

Introspective AI

Participants discussed both the advantages and challenges associated with designing introspective AI, which is capable of examining its own processes and could be beneficial in making predictions about the future. Participants also questioned, however, whether we should be trying to make AI more self-aware and human-like. Neuroscientist Ed Boyden explored introspection through the lens of neural pathways, and asked whether we can design introspective AI before we understand introspection in brains. Following the introspection theme, philosopher Luisa Damiano addressed the reality versus fiction of “social-embodied” AI – the idea of robots interacting with us in our physical world – arguing that such a possibility would require careful ethical considerations. 

AI is already a powerful, and growing, tool for particle physics

Many participants advocated developing so-called “strong” AI technology that can solve problems it has not come across before, in line with specific and targeted goals. Computer scientist Max Welling explored the potential for AI to exceed human intelligence, and suggested that AI can potentially be as creative as humans, although further research is required.

On the subject of ethics, Anja Kaspersen (former director of the UN Office for Disarmament Affairs) asked: who makes the rules? Linking to military, humanitarian and technological affairs, she considered how our experience in dealing with nuclear weapons could help us deal with the development of AI. She said that AI is prone to ethics washing: the process of creating an illusory sense that ethical issues are being appropriately addressed when they are not. Participants agreed that we should seek to avoid polarising the community when considering risks associated with current and future AI, and suggested a more open approach to deal with the challenges faced by AI today and tomorrow. Skype co-founder Jaan Tallinn identified AI as one of the most worrying existential risks facing society today; the fact that machines do not consider whether their decisions are unethical demands that we consider the constraints of the AI design space within the realm of decision making.

Fruits of labour

The initial outcomes of the Sparks! Serendipity Forum are being written up as a CERN Yellow Report, and at least one paper will be submitted to the journal Machine Learning: Science and Technology. Time will tell what other fruits the serendipitous interactions at Sparks! will bring. One thing is certain, however: AI is already a powerful, and growing, tool for particle physics. Without it, the LHC experiments’ analyses would have been much more tortuous, as discussed by Jennifer Ngadiuba and Maurizio Pierini (CERN Courier September/October 2021 p31).

Future editions of the Sparks! Serendipity Forum will tackle different themes in science and innovation that are relevant to CERN’s research. The 2022 event will be built around future health technologies, including the many accelerator, detector and simulation technologies that are offshoots of high-energy-physics research.

Training future experts in the fight against cancer

The leading role of CERN in fundamental research is complemented by its contribution to applications for the benefit of society. A strong example is the Heavy Ion Therapy Masterclass (HITM) school, which took place from 17 to 21 May 2021. Attracting more than 1000 participants from around the world, many of whom were young students and early-stage researchers, the school demonstrated the enormous potential to train the next generation of experts in this vital application. It was the first event of the European Union project HITRIplus (Heavy Ion Therapy Research Integration), in which CERN is a strategic partner along with other research infrastructures, universities, industry partners, the four European heavy-ion therapy centres and the South East European International Institute for Sustainable Technologies (SEEIIST). As part of a broader “hands-on training” project supported by the CERN & Society Foundation with emphasis on capacity building in Southeast Europe, the event was originally planned to be hosted in Sarajevo but was held online due to the pandemic.

The school’s scientific programme highlighted the importance of developments in fundamental research for cancer diagnostics and treatment. Focusing on treatment planning, it covered everything needed to deliver a beam to a tumour target, including the biological response of cancerous and healthy tissues. The Next Ion Medical Machine Study (NIMMS) group contributed many presentations from experts and young researchers, ranging from basic concepts to discussions of open points and plans for upgrades. Expert-guided practical sessions were based on matRad, the open-source treatment-planning toolkit developed by the German cancer research centre DKFZ for training and research. Several elements of the course were inspired by the International Particle Therapy Masterclasses.

Virtual visits to European heavy-ion therapy centres and research infrastructures were ranked by participants among the most exciting components of the course. There were also plenty of opportunities for participants to interact with experts in dedicated sessions, including a popular session on entrepreneurship by the CERN Knowledge Transfer group. This interactive approach had a big impact on participants, several of whom were motivated to pursue careers in related fields and to get actively involved at their home institutes. This future expert workforce will become the backbone for building and operating future heavy-ion therapy and research facilities that are needed to fight cancer worldwide (see Linacs to narrow radiotherapy gap).

Further support is planned at upcoming HITRIplus schools on clinical and medical aspects, as well as through HITRIplus internships, which offer access to existing European heavy-ion therapy centres and the opportunity to contribute to relevant research projects.

A systematic approach to systematics

Whenever we perform an analysis of our data, whether measuring a physical quantity of interest or testing some hypothesis, it is necessary to assess the accuracy of our result. Statistical uncertainties arise from the limited accuracy with which we can measure anything, or from the natural Poisson fluctuations involved in counting independent events. They have the property that repeated measurements result in greater accuracy.

Systematic uncertainties, on the other hand, arise from many sources and may not cause a spread in results when experiments are repeated, but merely shift them away from the true value. Accumulating more data usually does not reduce the magnitude of a systematic effect. As a result, estimating systematic uncertainties typically requires much more effort than for statistical ones, and more personal judgement and skill is involved. Furthermore, statistical uncertainties between different analyses are usually independent; this is often not so for systematics.
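
The distinction is easy to see in a toy Monte Carlo (a minimal Python sketch with hypothetical numbers, not drawn from any real analysis): a counting experiment whose counts carry both Poisson fluctuations and a common 5% miscalibration.

```python
import numpy as np

rng = np.random.default_rng(seed=1)

true_mean = 100.0   # hypothetical true event count per run
scale_bias = 1.05   # assumed 5% miscalibration: a systematic shift

for n_runs in (10, 100, 1000, 10000):
    # Poisson-fluctuating counts, all distorted by the same miscalibration
    counts = scale_bias * rng.poisson(true_mean, size=n_runs)
    mean = counts.mean()
    stat_err = counts.std(ddof=1) / np.sqrt(n_runs)
    print(f"{n_runs:6d} runs: estimate = {mean:7.2f} +- {stat_err:5.2f} (stat)")

# The statistical error shrinks like 1/sqrt(n_runs), but every estimate
# settles near 105 rather than 100: more data never removes the shift.
```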

The November event saw the largest number of statisticians at any PHYSTAT meeting

In particle-physics analyses, many systematics are related to detector and analysis effects. Examples include trigger efficiency; jet energy scale and resolution; identification of different particle types; and the strength of backgrounds and their distributions. There are also theoretical uncertainties which, as well as affecting predicted values for comparison with measured ones, can influence the experimental variables extracted from the data. Another systematic comes from the intensity of accelerator beams (the integrated luminosity at the LHC, for example), which is likely to be correlated for the various measurements made using the same beams.

At the LHC, it is in analyses with large amounts of data that systematics are likely to be most relevant. For example, a measurement of the mass of the W boson published by the ATLAS collaboration in 2018, based on a sample of 14 million W-boson decays, had a statistical uncertainty of 7 MeV but a systematic uncertainty of 18 MeV.
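
Combining the two components in quadrature, the standard prescription when they are uncorrelated, shows how thoroughly the systematic dominates:

\[
\sigma_\mathrm{total} = \sqrt{\sigma_\mathrm{stat}^2 + \sigma_\mathrm{syst}^2} = \sqrt{7^2 + 18^2}\ \mathrm{MeV} \approx 19\ \mathrm{MeV}.
\]

Quadrupling the data sample would halve the 7 MeV statistical term but leave the 18 MeV systematic untouched.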

PHYSTAT-Systematics

Two big issues for systematics are how the magnitudes of the different sources are estimated, and how they are then incorporated in the analysis. The PHYSTAT-Systematics meeting concentrated on the latter, as it was thought that this was more likely to benefit from the presence of statisticians – a powerful feature of the PHYSTAT series, which started at CERN in 2000.

The 20 talks fell into three categories. The first were those devoted to analyses in different particle-physics areas: the LHC experiments; neutrino-oscillation experiments; dark-matter searches; and flavour physics. A large amount of relevant information was discussed, with interesting differences in the separate sub-fields of particle physics. For example, in dark-matter searches, upper limits are sometimes set using Yellin’s Maximum Gap method when the expected background is low, or by using Power Constrained Limits, whereas these tend not to be used in other contexts.
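
To give a flavour of the first of these, here is a compact Python sketch of a maximum-gap upper limit (the C0 formula follows Yellin, Phys. Rev. D 66 032005; the interface and function names are our own, and the sketch assumes a fixed, background-free signal model):

```python
import math
import numpy as np
from scipy.optimize import brentq

def c0(x, mu):
    """Probability that the maximum gap is smaller than x when the total
    expected number of events is mu (Yellin's C0, rearranged so each term
    stays finite when mu == k*x)."""
    total = 0.0
    for k in range(int(mu // x) + 1):
        d = k * x - mu  # d <= 0 for every summed k
        total += math.exp(-k * x) / math.factorial(k) * (
            d**k - k * d**(k - 1) if k else 1.0)
    return total

def max_gap_upper_limit(cdf_values, cl=0.90):
    """Upper limit on the expected signal mu, given the signal-model CDF
    evaluated at each observed event position (values in [0, 1])."""
    u = np.sort(np.concatenate(([0.0], np.asarray(cdf_values, float), [1.0])))
    g = np.diff(u).max()  # largest event-free gap, as a fraction of the CDF
    # The limit is the mu at which a gap this large would be exceeded
    # with probability 1 - cl
    return brentq(lambda mu: c0(g * mu, mu) - cl, 1e-9, 1e4)

# Sanity check: with no observed events the gap spans the whole CDF and
# the 90% CL limit reproduces the familiar Poisson result of 2.30 events.
print(max_gap_upper_limit([]))   # ~2.303
```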

The second group followed themes: theoretical systematics; unfolding; mis-modelling; an appeal for experiments to publish their likelihood functions; and some of the many aspects that arise in using machine learning (where the machine-learning process itself can result in a systematic, and the increased precision of a result should not be at the expense of accuracy).

Finally, there was a series of talks and responses by statisticians. The November event saw the largest number of statisticians at any PHYSTAT meeting, and the efforts that they made to understand our intricate analyses and the statistical procedures that we use were much appreciated. It was valuable to have insights from a different viewpoint on the largely experimental talks. David van Dyk, for instance, emphasised the conceptual and practical differences between simply using a subsidiary experiment’s estimate of a systematic to assess its effect on a result, and using the combined likelihood function for the main and subsidiary measurements. Also, in response to talks about flavour physics and neutrino-oscillation experiments, attention was drawn to the growing impact in cosmology of non-parametric, likelihood-free (i.e. simulation-based) and Bayesian methods. Likelihood-free methods came up again in response to a modelling talk based on LHC-experiment analyses, and the role of risk estimation was emphasised by statisticians. Such suggestions for alternative statistical strategies open the door to further discussions about the merits of new ideas in particular contexts.

A novel feature of this remote meeting was that the summary talks were held a week later, to give speakers Nick Wardle and Sara Algeri more time. In her presentation, Algeri, a statistician, called for improved interaction between physicists and statisticians in dealing with these interesting issues.

Overall, the meeting was a good step on the path towards having a systematic approach to systematics. Systematics is an immense topic, and it was clear that one meeting spread over four afternoons was not going to solve all the issues. Ongoing PHYSTAT activities are therefore planned, and the organisers welcome further suggestions.

2021 IOP Awards

The UK Institute of Physics has announced its 2021 awards, recognising several high-energy and nuclear physicists across three categories.

David Deutsch and Ian Chapman

In the gold-medal category David Deutsch of the University of Oxford has been awarded the Isaac Newton Prize “for founding the discipline named quantum computation and establishing quantum computation’s fundamental idea, now known as the ‘qubit’ or quantum bit.” In the same category, Ian Chapman received the Richard Glazebrook Prize “for outstanding leadership of the UK Atomic Energy Authority and the world’s foremost fusion research and technology facility, the Joint European Torus, and the progress it has delivered in plasma physics, deuterium-tritium experiments, robotics, and new materials”.

Silver medal

Among this year’s silver-medal recipients, experimentalist Mark Lancaster of the University of Manchester earned the James Chadwick Prize “for distinguished, precise measurements in particle physics, particularly of the W boson mass and the muon’s anomalous magnetic moment”. Michael Bentley (University of York) received the Ernest Rutherford Prize for his contributions to the understanding of fundamental symmetries in atomic nuclei, while Jerome Gauntlett (Imperial College London) received the John William Strutt, Lord Rayleigh Prize for applications of string theory to quantum field theory, black holes, condensed matter physics and geometry.

Bronze medals collage

Finally, in the bronze-medal category for early-career researchers, the Daphne Jackson Prize for exceptional contributions to physics education goes to accelerator physicist Chris Edmonds (University of Liverpool) in recognition of his work in improving access for the visually impaired, for example via the Tactile Collider project. And the Mary Somerville Prize for exceptional contributions to public engagement in physics goes to XinRan Liu (University of Edinburgh) for his promotion of UK research and innovation to both national and international audiences.

Lyn Evans and Tim Palmer

Acknowledging physicists who have contributed to the field generally, 2021 honorary Institute of Physics fellowships were granted to Lyn Evans (for sustained and distinguished contributions to, and leadership in, the design, construction and operation of particle accelerator systems, and in particular the LHC) and climate physicist Tim Palmer, a proponent of building a ‘CERN for climate change’, for his pioneering work exploring the nonlinear dynamics and predictability of the climate system.

Beate Heinemann appointed director at DESY

Beate Heinemann

Experimental particle physicist Beate Heinemann has been announced as the new director of DESY’s High Energy Physics division, effective from 1 February. Succeeding interim director Ties Behnke, who had held the position since January 2021 when Joachim Mnich joined CERN as director for research and computing, she is the first female director in DESY’s 60-year history.

After completing a PhD at the University of Hamburg in 1999, based on data from the H1 experiment at DESY’s former electron–proton collider HERA, Heinemann did a postdoc at the University of Liverpool, UK, working on the CDF experiment at Fermilab. She became a lecturer at Liverpool in 2003, a professor at UC Berkeley in 2006 and a scientist at Lawrence Berkeley National Laboratory.

In 2007 Heinemann joined the ATLAS collaboration, in which she helped with the installation, commissioning and data-quality assessment of the pixel detector, as well as performing other roles including data-preparation coordinator during the LHC startup phase. She was deputy spokesperson of the ATLAS collaboration from 2013 to 2017, and since 2016 has been a senior scientist at DESY and W3 professor at Albert-Ludwigs-Universität Freiburg. She was also a member of the Physics Preparatory Group during the 2020 update of the European strategy for particle physics, and since 2017 she has been a member of the CERN Scientific Policy Committee.

Born in Hamburg, Heinemann is looking forward to the many exciting challenges, both scientifically and socially, ahead: “It is very important that we retain and further expand our pioneering role as a centre for fundamental research for the study of matter. In the next few years, the course will be set for the successor project to the LHC, whose technology and location have not yet been chosen. DESY must be actively involved in the preparation of this project in order to maintain and expand its pioneering role,” she explains. “Another topic that is very close to my heart, both personally and through my new office, is diversity. DESY should remain a cosmopolitan, diverse laboratory, and there is still room for improvement in many areas, for example the number of women in management positions.”

Hadron colliders in perspective

From visionary engineer Rolf Widerøe’s 1943 patent for colliding beams, to the high-luminosity LHC and its possible successor, the 14 October symposium “50 Years of Hadron Colliders at CERN” offered a feast of physics and history to mark the 50th anniversary of the Intersecting Storage Rings (ISR). Negotiating the ISR’s steep learning curve in the 1970s, the ingenious conversion of the Super Proton Synchrotron (SPS) into a proton–antiproton collider (SppS) in the 1980s, and the dramatic approval and switch-on of the LHC in the 1990s and 2000s chart a scientific and technological adventure story, told by its central characters in CERN’s main auditorium.

Former CERN Director-General (DG) Chris Llewellyn Smith swiftly did away with notions that the ISR was built without a physics goal. Viki Weisskopf (DG at the time) was well aware of the quark model, he said, and urged that the ISR be built to discover quarks. “The basic structure of high-energy collisions was discovered at the ISR, but you don’t get credit for it because it is so obvious now,” said Llewellyn Smith. Summarising the ISR physics programme, Ugo Amaldi, former DELPHI spokesperson and a pioneer of accelerators for hadron therapy, listed the observation of charmed-hadron production in hadronic interactions, studies of the Drell–Yan process, and measurements of the proton structure function as ISR highlights. He also recalled the frustration at CERN in late 1974 when the J/ψ meson was discovered at Brookhaven and SLAC, remarking that history would have changed dramatically had the ISR detectors also enabled coverage at high transverse momentum.

A beautiful machine

Amaldi sketched the ISR’s story in three chapters: a brilliant start followed by a somewhat difficult time, then a very active and interesting programme. Former CERN director for accelerators and technology Steve Myers offered a first-hand account, packed with original hand-drawn plots, of the battles faced and the huge amount learned in getting the first hadron collider up and running. “The ISR was a beautiful machine for accelerator physics, but sadly is forgotten in particle physics,” he said. “One of the reasons is that we didn’t have beam diagnostics, on account of the beam being a coasting beam rather than a bunched beam, which made it really hard to control things during physics operation.” Stochastic cooling, a “huge surprise”, was the ISR’s most important legacy, he said, paving the way for the SppS and beyond.

Former LHC project director Lyn Evans took the baton, describing how the confluence of electroweak theory, the SPS as collider and stochastic cooling led to rapid progress. It started with the Initial Cooling Experiment in 1977–1978, then the Antiproton Accumulator. It would take about 20 hours to produce a bunch dense enough for injection into the SppS, recalled Evans, and several other tricks to battle past the 26 GeV transition, where “lots of horrible things” happened. At 04:15 on 10 July 1981, with just him and Carlo Rubbia in the control room, first collisions at 270 GeV at the SppS were declared.

Poignantly, Evans ended his presentation “The SPS and LHC machines” there. “The LHC speaks for itself really,” he said. “It is a fantastic machine. The road to it has been a long and very bumpy one. It took 18 years between the approval of the LHC and the discovery of the Higgs. But we got there in the end.”

Discovery machines

The parallel world of hadron-collider experiments was brought to life by Felicitas Pauss, former CERN head of international relations, who recounted her time as a member of the UA1 collaboration at the SppS during the thrilling period of the W and Z discoveries. Jumping to the present day, early-career researchers from the ALICE, ATLAS, CMS and LHCb collaborations brought participants up to date with the progress at the LHC in testing the Standard Model and the rich physics prospects at Run 3 and the HL-LHC.

Few presentations at the symposium did not mention Carlo Rubbia, who instigated the conversion of the SPS into a hadron collider and was the prime mover of the LHC, particularly, noted Evans, during the period when the US Superconducting Super Collider was under construction. His opening talk presented a commanding overview of colliders, their many associated Nobel prizes and their applications in wider society.

During a brief Q&A at the end of his talk, Rubbia reiterated his support for a muon collider operating as a Higgs factory in the LHC tunnel: “The amount of construction is small, the resources are reasonable, and in my view it is the next thing we should do, as quickly as possible, in order to make sure that the Higgs is really what we think it is.”

It seems in hindsight that the LHC was inevitable, but it was anything but

Christopher Llewellyn Smith

In a lively and candid presentation about how the LHC got approved, Llewellyn Smith also addressed the question of the next collider, noting it will require the unanimous support of the global particle-physics community, a “reasonable” budget envelope and public support. “It seems in hindsight that the LHC was inevitable, but it was anything but,” he said. “I think going to the highest energy is the right way forward for CERN, but no government is going to fund a mega project to reduce error bars – we need to define the physics case.”

Following a whirlwind “view from the US”, in which Young-Kee Kim of the University of Chicago described the Tevatron and RHIC programmes and collated congratulatory messages from the US Department of Energy and others, CERN DG Fabiola Gianotti rounded off proceedings with a look at the future of the LHC and beyond. She updated participants on the significant upgrade work taking place for the HL-LHC and on the status of the Future Circular Collider feasibility study, a high-priority recommendation of the 2020 update of the European strategy for particle physics which is due to be completed in 2025. “The extraordinary success of the LHC is the result of the vision, creativity and perseverance of the worldwide high-energy physics community and more than 30 years of hard work,” the DG stated. “Such a success demonstrates the strength of the community and it’s a necessary milestone for future, even more ambitious, projects.”

Videos from the one-off symposium, capturing the rich interactions between the people who made hadron colliders a reality, are available online.

Harnessing the LHC network

On 15 November, around 260 physicists gathered at CERN (90 in person) to participate in the 2021 LHC Career Networking event, which is aimed at physicists, engineers and others who are considering leaving academia for a career in industry, non-governmental organisations and government. It was the fifth event in a series that was initially limited to attendance only by members of LHC experiments but which, in light of its strong resonance within the community, is now open to all.

Former members of the LHC experiments were invited to share their experiences of working in organisations ranging from the Ellen MacArthur Foundation to consultancies such as McKinsey and pharmaceutical companies such as Boehringer Ingelheim. They spoke movingly of the difficulties of leaving academia and research, the introspection required to discover the path that was right for them, and the sense of satisfaction and happiness they felt in their new roles.

Adjusting to new environments

Following a supportive welcome from Joachim Mnich, CERN director of research and computing, and Marianna Mazzilli, a member of the ALICE collaboration and chair of the organising committee, the first speaker to take to the stage in the main auditorium was Florian Kruse. Florian was a physicist on the LHCb experiment who, upon leaving CERN, decided to set up his own data-science and AI company called Point 8 – a throwback to the many years he spent commuting to the LHCb pit at LHC Point 8. His company has grown from three to 20 staff members, some of them ex-CERN, and continues to expand.

Setting the tone for the evening, he talked about what to expect when interacting with industry, how people view CERN physicists and where and how adjustments have to be made to adapt to a new environment – advising participants to “recalibrate your imposter syndrome” and “adjust to other audiences”.

Julia Hunt, a former CMS experimentalist, shared a personal insight into her journey out of academia, revealing that she fortuitously came across sailor Ellen MacArthur’s TED talk and soon landed the job of project manager at the Ellen MacArthur Foundation.

The field of data science has welcomed numerous former CERN physicists, among them ex-ATLAS members Max Baak and Till Eifert, former CMS and ALICE member Torsten Dahms, ex-CMS member Iasonas Topsis-Giotis and ex-ALICE member Elena Bruna. Max gave a mini-course in bond trading at ING bank, while Iasonas put a positive spin on his long search for a job by saying that each interview or application taught him essential lessons for the next application, eventually landing him a job as a manager at professional services company Ernst & Young in Belgium. In a talk titled “19 years in physics… and then?”, Torsten shared the sleepless nights he endured when deliberating whether to continue in a field that had him relocate himself and his family five times in 15 years, ultimately turning down a tenure-track position in 2019.

Elena, who despite having a permanent position left the field in 2018 to become a data scientist at Boehringer Ingelheim, highlighted the differences between physics (where data structures are usually designed in advance and data are largely available) and data science (where the value of data is not always known a priori, and the data tend to be messier), and offered advice for a data-science CV: keep it to a maximum of two pages, and emphasise skills and tools such as big-data analysis, machine-learning techniques, Monte Carlo simulations and working in international teams. The topic of CVs came up repeatedly, a key message being that physicists must modify the language used in academic applications because people “outside” just don’t understand our terminology.

Two networking breaks, held in person and accompanied by beer, wine and pizza for those who were present and via Zoom breakout rooms for remote participants, were alive with questions and discussion. Former ATLAS member Till Eifert was surrounded by physicists eager to learn more about his role as a specialist consultant with McKinsey in Geneva, speaking passionately about the renewable energy, cancer diagnostics and decarbonisation projects he has worked on. Head of CERN Alumni relations Rachel Bray and her team were on hand to answer a multitude of questions about the CERN Alumni programme.

70–85% of jobs come through networking

Anthony Nardini

Emphasising the power of such events, speaker Anthony Nardini from entrepreneurial company On Deck cited a 2017 Payscale survey which found that 70–85% of jobs come through networking. Following up from the event on Twitter, he offered takeaways for all career “pivoters”: craft and prioritise your guiding principles, such as industry, job function, company stage and mission; create a daily information-gathering practice so that you are reading the same newsletters, articles and Twitter feeds as those in your target roles; identify and contact “pathblazers” in your target organisations who understand your background; and do the work to pitch how your unique skillset can help a startup to grow.

All the speakers gave their time and contact details for follow-up questions and advice. The overall message was that, while the transition out of academia can be hard, CERN’s brand recognition in certain fields helps enormously. Use your connections and have confidence!
