
To Russia with love

“Why do you give all those secrets to the Russians?” So teases an inebriated Mary Bunemann, confidante to the leading nuclear physicists at the UK’s Atomic Energy Research Establishment, at the emotional climax of Frank Close’s new book Trinity: The Treachery and Pursuit of the Most Dangerous Spy in History. The scene is a party on New Year’s Eve in 1949, in the cloistered laboratory at Harwell, in the Berkshire countryside. With her voice audible across a room populated by his close colleagues and friends, Bunemann unwittingly confronted theoretical physicist Klaus Fuchs with the truth of his double life. As Close’s text suspensefully unfolds, the biggest brain working on Britain’s effort to build a nuclear arsenal had been faced with the very same allegation by an MI5 interrogator just 10 days earlier.

Close’s story expands dramatically in scope when Peierls and Fuchs are recruited to the Manhattan Project

Klaus Fuchs began working on nuclear weapons in 1941, when he was recruited by Rudolf Peierls – the “midwife to the atomic age”, in Close’s estimation. Both men were refugees from Nazi Germany. A few years older, and better established in Britain, Peierls would become a friend and mentor to Fuchs. A quarter of a century later, Peierls would also establish a relationship with a young Frank Close, when Close arrived at Oxford’s theoretical physics department. Close has now been able to make a poignant contribution to the literature of the bomb by drawing on his personal connection to the Peierls family, who felt Fuchs’ betrayal bitterly and were personally affected by the suspicion engendered by his espionage.

Close’s story expands dramatically in scope when Peierls and Fuchs are recruited to the Manhattan Project. Though Peierls was among the first to glimpse the power of atomic weapons, Fuchs began to exceed him in significance to the project during this period. In one of the strongest portions of the book, Close balances physics, politics and the intrigue of shady meetings with Fuchs’ handlers at a time when he passed to the Soviet Union a complete set of instructions for building the first stage of a uranium bomb, a full description of the plutonium bomb used in the Trinity test in the New Mexico desert, and detailed notes on Enrico Fermi’s lectures on the hydrogen bomb.

Intensely claustrophobic

The story becomes intensely claustrophobic when Fuchs returns to England to head the theoretical physics department at Harwell. Here, Close evokes the contradictions in Fuchs’ character: his conviction that nuclear knowledge should be shared between great powers to avert war; his principled but tested faith in communism, awakened while protesting the rise of Nazism; his devoted pastoral care for members of his inner circle at Harwell, even as the net closed around him; and his willingness to share not only nuclear secrets but also the bed of his colleague’s wife. Close has a particular obsession with the question of whether Fuchs’ eventual confession was induced by unrealistic suggestions that he could be forgiven and continue his work. But inducement did not jeopardise Fuchs’ ultimate conviction and imprisonment, despite MI5’s fears, and Close judges his 14-year sentence, later reduced, to be just. Even here, however, the Soviets had the last laugh, with Fuchs’ apprehension not only depriving the British nuclear programme of its greatest intellectual asset, but also precipitating the defection of Bruno Pontecorvo.

Trinity book cover

Close chose an ideal moment to research his history, writing with the benefit of newly released MI5 records, and before several others were withdrawn without notice. He applies forensic attention to the agency’s pursuit of the nuclear spy. Occasionally, however, this is to the detriment of the reader, with events seemingly diffracted onto the pages – both prefigured and returned to as the story progresses and new evidence comes to light. We step through time in Fuchs’ shoes, for example only learning at the end of the book that two other spies at the Manhattan Project were also passing information to the Russians. While Close’s inclination to let the evidence speak for itself is surely the mark of a good physicist, readers in search of a more analytical history may wish to also consult Mike Rossiter’s 2014 biography The Spy Who Changed the World: Klaus Fuchs and the secrets of the nuclear bomb, which offers a more rounded presentation of the Russian and American perspectives.

By bringing physics expertise, personal connections and impressive attention to detail to bear, Frank Close’s latest book has much to offer readers seeking insights into a formative time for the field, when the most talented minds in nuclear physics also bore the weight of world politics on their shoulders. He eloquently tells the tragedy of “the most dangerous spy in history”, as it played out between the trinity of Fuchs, his mentor Peierls and a shadowy network of spooks. Above all, the text is an intimate portrait of the inner struggles of a principled man who betrayed his adopted homeland, even as he grew to love it, and by doing so helped to shape the latter half of the 20th century.

Very high-energy electrons for cancer therapy

Dosimetry experiment for VHEE studies

Radiotherapy (RT) is a fundamental component of effective cancer treatment and control. More than 10,000 electron linear accelerators are currently used worldwide to treat patients with RT, most operating in the low beam-energy range of 5–15 MeV. Usually the electrons are directed at high-density targets to generate bremsstrahlung, and it is the resulting photon beams that are used for therapy. While low-energy electrons have been used to treat cancer for more than five decades, their very low penetration depth tends to limit their application to superficial tumours. The use of high-energy electrons (up to 50 MeV) was studied in the 1980s, but not clinically implemented.

More recently, the idea of using very high-energy (50–250 MeV) electron beams for RT has gained interest. For higher-energy electrons the penetration becomes deeper and the transverse penumbra sharper, potentially enabling the treatment of deep-seated tumours. While the longitudinal dose deposition is also spread over a greater depth, this can be controlled by focusing the electron beam.
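A zeroth-order estimate illustrates the scale involved (a rough sketch that neglects bremsstrahlung and multiple scattering, which reduce the clinically useful depth): taking a collisional stopping power of roughly 2 MeV/cm for electrons in water,

$$ R \;\sim\; \frac{E}{\langle \mathrm{d}E/\mathrm{d}x\rangle_{\mathrm{col}}} \;\approx\; \frac{100\ \mathrm{MeV}}{2\ \mathrm{MeV/cm}} \;\approx\; 50\ \mathrm{cm}, $$

compared with only a few centimetres for the 5–15 MeV electrons used conventionally.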

The production of very high-energy electrons (VHEE) for RT was the subject of the VHEE 2020 International Workshop, organised by CERN and held remotely on 5–7 October. More than 400 scientists, ranging from clinicians to biologists, and from accelerator physicists to dosimetry experts, gathered virtually to evaluate the prospects of this novel technique.

FLASH effect

VHEE beams offer several benefits. First, small-diameter high-energy beams can be scanned and focused easily, enabling finer resolution for intensity-modulated treatments than is possible for photon beams. Second, electron accelerators are more compact and significantly cheaper than current installations required for proton therapy. Third, VHEE beams can operate at very high dose rates, possibly compatible with the generation of the “FLASH effect”.

FLASH-RT is a paradigm-shifting method for delivering ultra-high doses within an extremely short irradiation time (tenths of a second). The technique has recently been shown to preserve normal tissue in various species and organs while still maintaining anti-tumour efficacy equivalent to conventional RT at the same dose level, in part due to decreased production of toxic reactive oxygen species. The FLASH effect has been shown to take place with electron, photon and more recently proton beams. However, electron beams promise to deliver intrinsically higher dose rates than protons and photons, especially over the large areas that would be needed for large tumours. Most of the preclinical data demonstrating the increased therapeutic index of FLASH are based on single-fraction or hypo-fractionated regimens of RT with 4–6 MeV beams, which cannot reach deep-seated tumours and produce a large lateral penumbra. This problem can be solved by increasing the electron energy to values higher than 50 MeV, where the penetration depth is larger.
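To set the scale of “ultra-high dose rate” (illustrative numbers, not workshop results): a conventional 2 Gy fraction delivered over about two minutes corresponds to a mean dose rate of roughly

$$ \dot{D}_{\mathrm{conv}} \approx \frac{2\ \mathrm{Gy}}{120\ \mathrm{s}} \approx 0.02\ \mathrm{Gy/s}, $$

whereas a FLASH delivery of, say, 10 Gy in 100 ms corresponds to

$$ \dot{D}_{\mathrm{FLASH}} \approx \frac{10\ \mathrm{Gy}}{0.1\ \mathrm{s}} = 100\ \mathrm{Gy/s}, $$

several thousand times higher, and above the mean dose rates of a few tens of Gy/s typically quoted for the onset of the FLASH effect.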

Today, after three decades of research into linear colliders, it is possible to build compact high-gradient (~100 MV/m) linacs, making a compact and cost-effective VHEE RT accelerator a reality. Furthermore, novel accelerator techniques such as laser-plasma acceleration are also starting to be applied in the VHEE field. These approaches are currently the subject of a wide international study, as presented at the VHEE workshop.

At the same time as pioneering preliminary work on FLASH was being carried out by researchers at Lausanne University Hospital (CHUV) in Switzerland and the Institut Curie in France, advances in high-gradient linac technology for VHEE were being made at CERN for the proposed Compact Linear Collider (CLIC). An extensive R&D programme on normal-conducting radio-frequency accelerating structures has been carried out to meet the demanding performance requirements of the CLIC linac: an accelerating gradient of 100 MV/m, a low breakdown rate, micron-tolerance alignment and a high RF-to-beam efficiency (around 30%). All this is now being applied in the conceptual designs of new RT facilities, such as one being developed jointly by CHUV and CERN.

Dose profile

High-energy challenges

Many challenges, both technological and biological, have to be addressed and overcome for the ultimate goal of using VHEE and VHEE-FLASH as an innovative modality for effective cancer treatment with minimal damage to healthy tissues. All of these were extensively covered and discussed in the different sessions of VHEE 2020.

From the accelerator-technology point of view, an important task is to assess the possibility of focusing and transversely scanning the beam, thereby overcoming the disadvantages associated in the past with low-energy electron- and photon-beam irradiation. In particular, in the case of VHEE–FLASH it has to be ensured that the biological effect is maintained. Stability, reliability and repeatability are other mandatory ingredients for accelerators to be operated in a medical environment.

The major challenge for VHEE–FLASH is the delivery of a very high dose rate, possibly over a large area, providing a uniform dose distribution throughout the target. The parameter window in which the FLASH effect takes place also still has to be thoroughly defined, as does its effectiveness as a function of the physical parameters of the electron beam. This, together with a clear understanding of the underlying biological processes, will likely prove essential in order to fully optimise the FLASH-RT technique. Of particular importance, as was repeatedly pointed out during the workshop, is the development of reliable online dosimetry for very high dose rates, a regime for which the current standard dosimetry techniques for RT are not suited. Ionisation chambers, routinely used in medical linacs, suffer from nonlinear effects at very high dose rates. To obtain reliable measurements, R&D is needed to develop novel ion chambers or explore alternative possibilities such as solid-state detectors or the use of calibrated beam diagnostics.
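As a rough illustration of why ionisation chambers become nonlinear at FLASH-level dose rates, the classic Boag model for pulsed beams predicts an ion-collection efficiency that falls as the dose per pulse rises. The sketch below is illustrative only: the constant k, which lumps together electrode spacing, polarising voltage and gas properties, is a placeholder and not a real chamber parameter.

```python
import math

def collection_efficiency(dose_per_pulse_gy, k=2.0):
    """Boag ion-collection efficiency f(u) = ln(1 + u) / u, with u proportional
    to the dose per pulse. k is an illustrative placeholder, not a calibrated
    chamber constant."""
    u = k * dose_per_pulse_gy
    return 1.0 if u <= 0 else math.log1p(u) / u

# Conventional linacs deliver of order 1e-3 Gy per pulse; FLASH beams can
# exceed 1 Gy per pulse, where the collected charge no longer tracks the dose.
for dpp in (1e-3, 1e-2, 1e-1, 1.0, 10.0):
    print(f"dose per pulse = {dpp:7.3f} Gy -> collection efficiency ~ {collection_efficiency(dpp):.3f}")
```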

All this demands extensive test activity across different laboratories to experimentally characterise VHEE beams and their ability to produce the FLASH effect, and to provide a testbed for the associated technologies. It is also important to compare the properties of the electron beams depending on the way they are produced (radio-frequency or laser-plasma accelerator technologies).

A number of experimental test facilities are already available to pursue these ambitious objectives: the CERN Linear Electron Accelerator for Research (CLEAR), so far unique in being able to provide both high-energy (50–250 MeV) and high-charge beams; VELA–CLARA at Daresbury Laboratory; PITZ at DESY; and ELBE at HZDR in Dresden, which uses superconducting radio-frequency technology. Further radiobiology studies with laser-plasma-accelerated electron beams are currently being performed at the DRACO petawatt laser facility of the ELBE Center at HZDR in Dresden, and at the Laboratoire d’Optique Appliquée of the Institut Polytechnique de Paris. Future facilities, as exemplified by the previously mentioned CERN–CHUV facility or the PHASER proposal at SLAC, are also on the horizon.

Establishing innovative treatment modalities for cancer is a major 21st-century health challenge. By 2040, cancer is predicted to be the leading cause of death, with approximately 27.5 million newly diagnosed patients and 16.3 million related deaths per year. The October VHEE workshop demonstrated the continuing potential of accelerator physics to drive new RT treatments, and also included a lively session dedicated to industrial partners. The large increase in attendance since the first workshop in 2017 in Daresbury, UK, shows the vitality of the field and the increasing interest in it.

CERN takes next step for hadron therapy

SEEIIST Ion Therapy Research Infrastructure

Twenty years ago, pioneering work at CERN helped propel Europe to the forefront of cancer treatment with hadron beams. The Proton Ion Medical Machine Study (PIMMS), founded in 1996 by a CERN–TERA Foundation–MedAustron–Oncology2000 collaboration, paved the way for the construction of two hadron-therapy centres: CNAO in Pavia (Italy) and MedAustron in Wiener Neustadt (Austria). A parallel pioneering development at GSI produced two similar centres in Germany (HIT in Heidelberg and MIT in Marburg). Since the commissioning of the first facility in 2009, the four European hadron-therapy centres have treated more than 10,000 patients with protons or carbon ions. The improved health and life expectancy of these individuals is the best reward for the vision of all those at CERN and GSI who laid the foundations for this new type of cancer treatment.

Almost four million new cancer cases are diagnosed per year in Europe, around half of which can be effectively treated with X-rays at relatively low cost. Where hadrons are advantageous is in the treatment of deep tumours close to critical organs or of paediatric tumours. For these cancers, the “Bragg peak” energy-deposition characteristic of charged particles reduces the radiation dose to organs surrounding the tumour, increasing survival rates and reducing negative side effects and the risk of recurrence. With respect to protons, carbon ions have the additional advantages of hitting the target more precisely with higher biological effect, and of being effective against radioresistant hypoxic tumours, which constitute between 1 and 3% of all radiation-therapy cases. Present facilities treat only a small fraction of all patients who could take advantage of hadron therapy, however. The diffusion of this relatively novel cancer treatment is primarily limited by its cost, and by the need for more pre-clinical and clinical research to fully exploit its potential.

Given these limitations, how can the scientific community contribute to extending the benefits of hadron therapy to a larger number of cancer patients? To review this and similar questions, CERN has recently given a new boost to its medical accelerator activities, after a long interruption corresponding to the time when CERN resources were directed mainly towards LHC construction. The framework for this renewed effort was provided by the CERN Council in 2017 when it approved a strategy concerning knowledge transfer for the benefit of medical applications. This strategy specifically encouraged new initiatives to leverage existing and upcoming CERN accelerator technologies and expertise towards the design of a new generation of light-ion accelerators for medicine.

“canted-cosine-theta” coils

The hadron-therapy landscape in 2020 is very different from what it was 20 years ago. The principal reason is that industry has entered the field and developed a new generation of compact cyclotrons for proton therapy. Beyond the four hadron (proton and ion) centres there are now 23 industry-built facilities in Europe providing only proton therapy to about 4000 patients per year. Thanks to this new set of facilities, proton therapy is now highly developed and is progressively extending its reach in competition with more conventional X-ray radiation therapy.

Despite its many advantages over X-rays and protons, therapy with ions (mainly carbon, but other ions like helium or oxygen are under study) is still administered in Europe only by the four large hadron-therapy facilities. In comparison, eight ion-therapy accelerators are in operation in Asia, most of them in Japan, and four others are under construction. The development of new specific instruments for cancer therapy with ions is an ideal application for CERN technologies, in line with CERN’s role of promoting the adoption of cutting-edge technologies that might result in innovative products and open new markets.

Next-generation accelerators

To propel the use of cancer therapy with ions we need a next-generation accelerator, capable of bringing beams of carbon ions to the 430 MeV/u energy required to cover the full body, but with smaller dimensions and cost than the PIMMS-type machines. A new accelerator design with improved intensity and operational flexibility would also enable a wide research programme to optimise ion species and treatment modalities, in line with what was foreseen by the cancelled BioLEIR programme at CERN. This would allow the exploration of innovative paths to the treatment of cancer, such as ultra-short FLASH therapy or the promising combination of ion therapy with immunotherapy, which is expected to trigger an immune response against disseminated cancers and metastases. Moreover, a more compact accelerator could be installed in, or very close to, existing hospitals to fully integrate ion therapy in cancer-treatment protocols while minimising the need to transport patients over long distances.

The development of new specific instruments for cancer therapy with ions is an ideal application for CERN technologies

These considerations are the foundation for the Next Ion Medical Machine Study (NIMMS), a new CERN initiative that aims to develop specific accelerator technologies for the next generation of ion-therapy facilities and help catalyse a new European collective action for therapy with ion beams. The NIMMS activities were launched in 2019, following a workshop at ESI Archamps in 2018 where the medical and accelerator communities agreed on basic specifications for a new-generation machine. In addition to smaller dimensions and cost, these include a higher beam current for faster treatment, operation with multiple ions, and irradiation from different angles using a gantry system.

In addressing the challenges of new designs with reduced dimensions, CERN is building on the development work promoted in the last decade by the TERA Foundation. Reducing the accelerator dimensions relative to the conventional synchrotrons used so far can take different directions, of which two are particularly promising. The first is the classic approach of using superconductivity to increase the magnetic field and decrease the radius of the synchrotron, and the second consists of replacing the synchrotron with a high-gradient linear accelerator of a new design – in line with the proton-therapy linac being developed by ADAM, a spin-off company of CERN and TERA that is now part of the AVO group. The goal in both designs is to reduce the footprint of the accelerator by more than a factor of two, from about 1200 to 500 m2. With these considerations in mind, the NIMMS study has been structured in four work packages.

The main avenue to reduced dimensions is superconductivity, and the goal of the first work package is to develop new superconducting magnet designs for pulsed operation, with large apertures and curvatures – suitable for an ideal “square” synchrotron layout with only four 90 degree magnets. Different concepts are being explored, with some attention to the so-called canted cosine-theta design (see “Combined windings”) used for example in orbit correctors for the high-luminosity LHC, of which a team at Lawrence Berkeley National Laboratory has recently developed a curved prototype for medical applications. Other options under study are based on more traditional cosine-theta designs (see “Split yoke”), and on exploiting the potential of modern high-temperature superconductors. 
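To get a feel for the numbers driving the magnet designs, the bending radius of the synchrotron dipoles follows directly from the beam rigidity, Bρ = p/q. The back-of-the-envelope sketch below uses standard relativistic kinematics for fully stripped carbon at 430 MeV/u; the dipole field values are illustrative, not NIMMS specifications.

```python
import math

# Magnetic rigidity of a fully stripped carbon-12 beam at 430 MeV/u, and the
# dipole bending radius for a few illustrative field values.
AMU_MEV = 931.494          # atomic mass unit in MeV/c^2
A, Z = 12, 6               # mass number and charge of fully stripped carbon-12
T_PER_U = 430.0            # kinetic energy per nucleon in MeV

# relativistic momentum per nucleon: (pc)^2 = T^2 + 2*T*(m*c^2)
pc_per_u = math.sqrt(T_PER_U**2 + 2.0 * T_PER_U * AMU_MEV)   # MeV
pc_total = A * pc_per_u                                       # MeV for the whole ion

# B*rho [T m] = pc [MeV] / (299.792 * q [in units of e])
brho = pc_total / (299.792 * Z)
print(f"rigidity B*rho ~ {brho:.2f} T m")

for field_T in (1.5, 3.0, 4.5):   # roughly normal-conducting vs superconducting fields
    print(f"  B = {field_T:.1f} T -> bending radius ~ {brho / field_T:.2f} m")
```

The rigidity comes out at roughly 6.6 T m, so higher dipole fields translate directly into a smaller ring, which is the motivation for the superconducting options.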

curved cosine-theta dipole

The second work package covers the design of a compact linear accelerator optimised for installation in hospitals. Operating at 3 GHz with high field gradients, this linac design profits from the expertise gained with accelerating structures developed for the proposed Compact Linear Collider (CLIC), and uses as an injector a novel source for fully-stripped carbon based on the REX-ISOLDE design. The source is followed by a 750 MHz radio-frequency quadrupole using the design recently developed at CERN for medical and industrial applications.

The third NIMMS work package focuses on compact superconducting designs for the gantry, the large element required to precisely deliver ion beams to the patient that is critical for the cost and performance of an ion-therapy facility. The problem of integrating a large-acceptance beam optics with a compact superconducting magnetic system within a robust mechanical structure is an ideal challenge for the expertise of the CERN accelerator groups. Two designs are being considered: a lightweight rotational gantry covering only 180 degrees originally proposed by TERA, and the GaToroid toroidal gantry being developed at CERN.

NIMMS will consider new designs for the injector linac, with reduced cost and dimensions

The fourth work package is dedicated to the development of new high-current synchrotron designs, and to their integration in future cancer research and therapy facilities. To reduce treatment time, the goal is to accelerate a beam current more than an order of magnitude higher than in the present European facilities. This requires careful multi-turn injection into the ring and strict control of the beam optics, which add to other specific features of the new design, including a fast extraction that will make tests with the new ultra-fast FLASH treatment modality possible. Two synchrotron layouts are being considered: a more conventional one with room-temperature magnets (see “Ions for therapy”), and a very compact superconducting one of only 27 m circumference. The latter, equipped with a gantry of new design, would allow a single-room carbon-therapy facility to be realised in an area of about 1000 m2. Additionally, NIMMS will consider new designs for the injector linac, with reduced cost and dimensions and including the option of producing medical radioisotopes – for imaging and therapy – during the otherwise idle time between two synchrotron injections.

Ambitious work plan

This ambitious work plan exceeds the resources that CERN can allocate to the study, and its development requires collaborations at different levels. The first enthusiastic partner is the new SEEIIST (South East European International Institute for Sustainable Technologies) organisation, which aims to build a pan-European facility for cancer research and therapy with ions (see “Ions for therapy”). SEEIIST is already joining forces with NIMMS by supporting staff working at CERN on synchrotron and gantry design. The second partnership is with the ion-therapy centres CNAO and MedAustron, which are evaluating the proposed superconducting gantry design with a view to extending the treatment capabilities of their facilities. A third critical partner is CIEMAT, which will build the high-frequency linac pre-injector and validate it with beam. Other partners participating in the study at different levels are GSI, PSI, HIT, INFN, Melbourne University, Imperial College and, of course, TERA, which remains one of the driving forces behind medical-accelerator developments. This wide collaboration has been successful in attracting additional support from the European Commission via two recently approved projects beginning in 2021. The multidisciplinary HITRIplus project on ion therapy includes work packages dedicated to accelerator, gantry and superconducting-magnet design, while the IFAST project for cutting-edge accelerator R&D contains an ambitious programme focusing on the optimisation and prototyping, with industry, of superconducting magnets for ion therapy.

Every technology starts from a dream, and particle accelerators are there to fulfil one of the oldest: looking inside the human body and curing it without bloodshed. It is up to us to further develop the tools to realise this dream.

Adapting CLIC tech for FLASH therapy

Walter Wuensch

About 30–40% of people will develop cancer during their lifetimes. Surgery, chemotherapy, immunotherapy and radiotherapy (RT) are used to cure or manage the disease. But around a third of cancers are multi-resistant to all forms of therapy, defining a need for more efficient and better tolerated treatments. Technological advances in the past decade or so have transformed RT into a precise and powerful treatment for cancer patients. Nevertheless, the treatment of radiation-resistant tumours is complicated by the need to limit doses to surrounding normal tissue.

A paradigm-shifting technique called FLASH therapy, which is able to deliver doses of radiation in milliseconds instead of minutes as for conventional RT, is opening new avenues for more effective and less toxic RT. Pre-clinical studies have shown that the extremely short exposure time of FLASH therapy spares healthy tissue from the hazardous effect of radiation without reducing its efficacy on tumours.

First studied in the 1970s, it is only during the past few years that FLASH therapy has caught the attention of oncologists. The catalyst was a 2014 study carried out by researchers from Lausanne University Hospital (CHUV), Switzerland, and from the Institut Curie in Paris, which showed an outstanding differential FLASH effect between tumours and normal tissues in mice. The results were later confirmed by several other leading institutes. Then, in 2019, CHUV used FLASH to treat a multi-resistant skin cancer in a human patient, causing the tumour to completely disappear with nearly no side effects.

The consistency of pre-clinical data showing a striking protection of normal tissues with FLASH compared to conventional RT offers a new opportunity to improve cancer treatment, especially for multi-resistant tumours. The very short “radiation beam-on-time” of FLASH therapy could also eliminate the need for motion management, which is currently necessary when irradiating tumours that move with respiration. Furthermore, since FLASH therapy operates best with high single doses, it requires only one or two RT sessions as opposed to multiple sessions over a period of several weeks in the case of conventional RT. This promises to reduce oncology workloads and patient waiting lists, while improving treatment access in low-population density environments. Altogether, these advantages could turn FLASH therapy into a powerful new tool for cancer treatment, providing a better quality of life for patients.

The key requirements for CLIC correspond astonishingly well with the requirements for a FLASH facility

CERN and CHUV join forces

CHUV is undertaking a comprehensive research program to translate FLASH therapy to a clinical environment. No clinical prototype is currently available for treating patients with FLASH therapy, especially for deep-seated tumours. Such treatments require very high-energy beams (see p12) and face technological challenges that can currently be solved only by a very limited number of institutions worldwide. As the world’s largest particle-physics laboratory, CERN is one of them. In 2019, CHUV and CERN joined forces with the aim of building a high-energy, clinical FLASH facility.

The need to deliver a full treatment dose over a large area in a short period of time demands an accelerator that can produce a high-intensity beam. Amongst the current radiation tools available for RT – X-rays, electrons, protons and ions – electrons stand out for their unique combination of attributes. Electrons with an energy of around 100 MeV penetrate many tens of centimetres in tissue so have the potential to reach tumours deep inside the body. This is also true for the other radiation modalities but it is technically simpler to produce intense beams of electrons. For example, electron beams are routinely used to produce X-rays in imaging systems such as CT scanners and in industrial applications such as electron beam-welding machines. In addition, it is comparatively simple to accelerate electrons in linear accelerators and guide them using modest magnets. A FLASH-therapy facility based on 100 MeV-range electrons is therefore a highly compelling option.

Demonstrating the unexpected practical benefits of fundamental research, the emergence of FLASH therapy as a potentially major clinical advance coincides with the maturing of accelerator technology developed for the CLIC electron–positron collider. In a further coincidence, the focus of FLASH development has been at CHUV in Lausanne, and of CLIC development at CERN in Geneva, just 60 km away. CLIC is one of the potential options for a post-LHC collider, and the design of the facility, as well as the development of key technologies, has been underway for more than 20 years. A recent update of the design, now optimised for an initial-energy stage of 380 GeV, together with updated prototype testing, was completed in 2018.

Despite the differences in scale and application, the key requirements for CLIC correspond astonishingly well with the requirements for a FLASH facility. First, CLIC requires high-luminosity collisions, for example to allow the study of rare interaction processes. This is achieved by colliding very high-intensity and precisely controlled beams: the average current during a pulse of CLIC is 1 A, and the linac hardware is designed to allow two beams less than 1 nm in diameter to collide at the interaction point. High levels of current that are superbly controlled are also needed for FLASH to cover large tumours in short times. Second, CLIC requires a high accelerating gradient (72 MV/m in the initial stage) to achieve its required collision energy in a reasonably sized facility (11 km for a 380 GeV first stage). A FLASH facility using 100 MeV electrons based on an optimised implementation of the same technology requires an accelerator just a couple of metres long. Other system elements such as diagnostics, beam shaping and delivery, as well as radiation shielding, make the footprint of the full facility somewhat larger. Overall, however, the compact accelerator technology developed for CLIC gives the possibility of clinical facilities built within the confines of a typical hospital campus and integrated with existing oncology departments.
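The scaling behind the “couple of metres” figure is straightforward: at a CLIC-like gradient, the active accelerating length needed for a 100 MeV beam is roughly

$$ L_{\mathrm{active}} \;\sim\; \frac{100\ \mathrm{MeV}}{72\ \mathrm{MV/m}} \;\approx\; 1.4\ \mathrm{m}, $$

with the injector, beam-delivery system and shielding adding to the overall footprint, as noted above.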

Over the decades, CLIC has invested significant resources into developing its high-current and high-gradient technology. Numerous high-power radio-frequency test stands have been built and operated, serving as prototypes for the radio-frequency system units that make up a linear accelerator. The high-current-beam test accelerator “CTF3” enabled beam-dynamics simulation codes to be benchmarked and the formation, manipulation and control of very intense electron beams to be demonstrated. Further beam-dynamics validations and relevant experiments have been carried out at different laboratories including ATF2 at KEK, FACET at SLAC and ATF at Argonne. CERN also operates the Linear Electron Accelerator for Research (CLEAR) facility, which can accelerate electrons up to 250 MeV, matching the energy requirements of FLASH radiotherapy. For the past several years, and beyond the collaboration between CERN and CHUV, the CLEAR facility has been involved in dosimetry studies for FLASH radiotherapy.

Towards a clinical facility

All of this accumulated experience and expertise is now being used to design and construct a FLASH facility. The collaboration between CERN and CHUV is a shining example of knowledge transfer, where technology developed for fundamental research is used to develop a therapeutic facility. While the technical aspects of the project have been defined via exchanges between medical researchers and accelerator experts, the CERN knowledge-transfer group and CHUV’s management have addressed contractual aspects and identified a strategy for intellectual property ownership. This global approach provides a clear roadmap for transforming the conceptual facility into a clinical reality. From the perspective of high-energy physics, the adoption of CLIC technology in commercially supplied medical facilities would significantly reduce technological risk and increase the industrial supplier base. 

An interdisciplinary team comprising medical doctors, medical physicists, radiation biologists and accelerator physicists and engineers was formed

The collaboration between CHUV and CERN was catalysed by a workshop on FLASH therapy hosted by CHUV in September 2018, when it was realised that an electron-beam machine based on CLIC technology offers the possibility of a high-performance clinical FLASH facility. An interdisciplinary team comprising medical doctors, medical physicists, radiation biologists and accelerator physicists and engineers was formed to study the possibilities in greater depth. In an intense exchange during the months following the workshop, in which requirements and capabilities were brought together and balanced, a clear picture of the parameters of a clinical FLASH facility emerged. Subsequently, the team studied critical issues in detail, validating that such a facility is in fact feasible. It is now working towards the details of a baseline design, with parameters specified at the system level, and the implementation of entirely new possibilities that were triggered by the study. A conceptual design report for the facility will be finished by the end of 2020. CHUV is actively seeking funding for the facility, which would require approximately three years from construction through to beam commissioning.

The basic accelerator elements of the 100 MeV-range FLASH facility that emerged from this design process consist of a photo-injector electron source; a linac optimised for high-current transport and for maximum efficiency in transferring radio-frequency power to the beam; and a beam-delivery system that shapes the beam for each individual treatment and directs it towards the patient. In addition, accelerator and clinical instrumentation are being designed that must work together to provide the level of precision and repeatability required for patient treatment. This latter issue is particularly critical in FLASH treatment, where all feedback and correction of the delivered dose to clinical levels must be completed in substantially less than a second. The radiation field is one area where the requirements of CLIC and FLASH are quite different. In CLIC the beam is focused to a very small spot (roughly 150 nm wide and 3 nm high) for maximum luminosity, whereas in FLASH the beam must be expanded to cover a large area (up to 10 cm across) of irregular cross section with high levels of dose uniformity. Although this requires a very different implementation of the beam-delivery systems, both CLIC and FLASH are designed using the same beam-dynamics tools and design methodologies.

Many challenges will have to be overcome, not least obtaining regulatory approval for such a novel system, but we are convinced that the fundamental ideas are sound and that the goal is within reach. A clinical FLASH facility based on CLIC technology is set to be an excellent example of the impact that developments made in the pursuit of fundamental science can have on society.

LHCb sheds light on Vub puzzle

The Cabibbo–Kobayashi–Maskawa (CKM) matrix element Vub describes the coupling between u and b quarks in the weak interaction, and is one of the fundamental parameters of the Standard Model (SM). Though it was first observed to be non-zero 30 years ago, its value is still debated. |Vub| determines the length of the least well-known side of the corresponding unitarity triangle, and is therefore a key ingredient for testing the consistency of the SM in the flavour sector. LHCb has recently published a new result on |Vub| using the first ever measurement of the Bs0 → Kμ+νμ decay.

LHCb Jan/Feb 2021 fig 1

|Vub| and |Vcb| are the focus of a longstanding puzzle: the world-average values derived from inclusive and exclusive B-meson decays disagree by more than three standard deviations, for both |Vub| and |Vcb|. Traditionally, the exclusive |Vub| determination requires the reconstruction of the semileptonic b → u decay B0 → πμ+νμ. LHCb also has access to Bs0-meson and b-baryon decays, but the missing neutrino makes it difficult to isolate the signal from the copious background. Defying expectations, however, in 2015 LHCb managed to observe the Λb0 → pμνμ decay, and used the normalisation channel Λb0 → Λc+μνμ to determine |Vub|/|Vcb|. The main difficulty in this type of analysis resides in the fact that only two charged particles are reconstructed in decays such as Bs0 → Kμ+νμ and Λb0 → pμνμ, so a huge background arising from other sources dominates the selected data sample. Machine-learning algorithms are therefore used to isolate the signal from the various background categories, which consist of decays with additional charged and/or neutral particles in the final state. The remaining irreducible background is modelled by using both simulation and control samples extracted from data.
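To illustrate the role such multivariate algorithms play (a toy sketch only, not the LHCb analysis: the features and samples below are entirely invented), a classifier is typically trained on labelled signal and background samples and then applied to data:

```python
# Toy illustration of multivariate signal/background separation.
# The three features stand in for typical discriminating variables
# (e.g. vertex isolation, corrected mass, flight distance); they are
# randomly generated here, not taken from any real analysis.
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(seed=0)
n = 20000
signal = rng.normal(loc=[1.0, 0.5, 2.0], scale=1.0, size=(n, 3))
background = rng.normal(loc=[0.0, 0.0, 1.0], scale=1.2, size=(n, 3))

X = np.vstack([signal, background])
y = np.concatenate([np.ones(n), np.zeros(n)])

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=1)
clf = GradientBoostingClassifier(n_estimators=200, max_depth=3)
clf.fit(X_train, y_train)
print(f"classification accuracy on the toy test sample: {clf.score(X_test, y_test):.3f}")
```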

This is the first experimental test of the form-factor calculations

First observation

In a recent paper, the LHCb collaboration presented the first observation of the decay Bs0 → Kμ+νμ. The decay Bs0 → Dsμ+νμ is used as a normalisation channel to minimise experimental systematic uncertainties. The study was performed in two regions of q2, the squared invariant mass of the muon–neutrino system (the momentum transfer): below and above 7 GeV2. The observed total yield was about 13,000 events, corresponding to a branching fraction of (1.06 ± 0.10) × 10–4, of which about one third stemmed from the low-q2 range (figure 1).

LHCb Jan/Feb 2021 fig 2

The extraction of the ratio |Vub|/|Vcb| requires external knowledge of the form factors describing the strong Bs0 → K and Bs0 → Ds transitions, to account for the interactions of the quarks bound in mesons. These vary with the momentum transfer and are calculated using non-perturbative techniques, such as lattice QCD (LQCD) and light-cone sum rules (LCSR). As LQCD and LCSR calculations are more accurate at high and low q2, respectively, they are used in the corresponding q2 regions. The value obtained in the high-q2 interval, |Vub|/|Vcb| = 0.095 ± 0.008, agrees with the world average of exclusive measurements and with the LHCb result using Λb0 → pμνμ decays, while in the low-q2 region the value |Vub|/|Vcb| = 0.061 ± 0.004 is significantly lower (figure 2). This is the first experimental test of the form-factor calculations, and new results are expected in the near future. These will help settle the exclusive-versus-inclusive debate surrounding the values of |Vub| and |Vcb|, and provide further constraints on the unitarity triangle.
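Schematically (a simplified sketch of the procedure described above, not the collaboration’s exact formula), the extraction rests on

$$ \left|\frac{V_{ub}}{V_{cb}}\right|^{2} \;=\; \frac{\mathcal{B}\!\left(B_s^0 \to K\mu^+\nu_\mu\right)\big|_{q^2\ \mathrm{region}}}{\mathcal{B}\!\left(B_s^0 \to D_s\mu^+\nu_\mu\right)} \;\times\; \frac{1}{R_{\mathrm{FF}}}, $$

where $R_{\mathrm{FF}}$ is the corresponding ratio of phase-space-integrated form factors, taken from LQCD in the high-q2 region and from LCSR in the low-q2 region.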

A long-lived paradigm shift

Searches for new physics at high-energy colliders traditionally target heavy new particles with short lifetimes, and these searches determine detector design, data-acquisition and analysis methods. However, there could be new long-lived particles (LLPs) that travel through the detectors without decaying, either because they are light or because they have small couplings. Searches for LLPs have been going on at the LHC since the start of data taking, and at previous colliders, but they have attracted increasing interest in recent years, especially in light of the lack of new particles discovered in more mainstream searches.

Detecting LLPs at the LHC experiments requires a paradigm shift with respect to the usual data-analysis and trigger strategies. To that end, more than 200 experimentalists and theorists met online from 16 to 19 November for the eighth workshop of the LHC LLP community.

Dark quarks would undergo fragmentation and hadronisation, resulting in “dark showers”

Strong theoretical motivations underpin searches for LLPs. For example, dark matter could be part of a larger dark sector, parallel to the Standard Model (SM), with new particles and interactions. If dark quarks could be produced at the LHC, they would undergo fragmentation and hadronisation in the dark sector resulting in characteristic “dark showers” — one of the focuses of the workshop. Collider signatures for dark showers depend on the fraction of unstable particles they contain and their lifetime, with a range of categories presenting their own analysis challenges: QCD-like jets, semi-visible jets, emerging jets, and displaced vertices with missing transverse energy. Delegates agreed on the importance of connecting collider-level searches for dark showers with astrophysical and cosmological scales. In a similar spirit of collaboration across communities, a joint session with the HEP Software Foundation focused on triggering and reconstruction software for dedicated LLP detectors.

Heavy neutral leptons

The discovery of heavy neutral leptons (HNLs) could address different open questions of the SM. For example, neutrinos are expected to be left-handed and massless in the SM, but oscillate between flavours as their wavefunction evolves, providing evidence for as-yet immeasurably small masses. One way to fix this problem is to complete the field pattern of the SM with right-handed HNLs. The number and other characteristics of HNLs depend on the model considered, but in many cases HNLs are long-lived and connect to other important questions of the SM, such as dark matter and the baryon asymmetry of the universe. There are many ongoing searches for HNLs at the LHC and many more proposed elsewhere. During the November workshop the discussion touched on different models and simulations, reviewing what is available and what is needed for the different signal benchmarks.

Another focus was the reinterpretation of previous LLP searches. Recasting public results is common practice at the LHC and a good way to increase physics impact, but reinterpreting LLP searches is more difficult than for prompt searches because of the use of non-standard selections and analysis-specific objects.


The latest results from CERN experiments were presented. ATLAS reported the first LHC search for sleptons using displaced-lepton final states, greatly improving sensitivity compared to LEP. CMS presented a search for strongly interacting massive particles with trackless jets, and a search for long-lived particles decaying to jets with displaced vertices. LHCb reported searches for low-mass di-muon resonances and a search for heavy neutrinos in the decay of a W boson into two muons and a jet, and the NA62 experiment at CERN’s SPS presented a search for π0 decays to invisible particles. These results bring important new constraints on the properties and parameters of LLP models.

Dedicated detectors

A series of dedicated LLP detectors at CERN — including the Forward Physics Facility for the HL-LHC, the CMS forward detector, FASER, CODEX-b and CODEX-β, MilliQan, MoEDAL-MAPP, MATHUSLA, ANUBIS, SND@LHC and FORMOSA — are in different stages between proposal and operation. These additional detectors, located at various distances from the LHC experiments, have diverse strengths: some, like MilliQan, look for specific particles (milli-charged particles, in that case), whereas others, like MATHUSLA, offer a very low background environment in which to search for neutral LLPs. These complementary efforts will, in the near future, provide all the different pieces needed to build the most complete picture possible of a variety of LLP searches, from axion-like particles to exotic Higgs decays, potentially opening the door to a dark sector.

ATLAS reported the first LHC search for sleptons using displaced-lepton final states

The workshop featured a dedicated session on future colliders for the first time. Designing these experiments with LLPs in mind would radically boost discovery chances. Key considerations will be tracking and the tracking volume, timing information, trigger and DAQ, as well as potential additional instrumentation in tunnels or using the experimental caverns.

Together with the range of new results presented and many more in the pipeline, the 2020 LLP workshop was representative of a vibrant research community, constantly pushing the “lifetime frontier”.

Nuclear win for ISOLDE physicists

2020 Lise Meitner winners

The nuclear physics division of the European Physical Society today awarded the 2020 Lise Meitner Prize to three physicists who have played a decisive role in turning a small-scale nuclear-physics experiment at CERN into a world-leading facility for the investigation of nuclear structure.

Klaus Blaum (Max Planck Institute for Nuclear Physics), Björn Jonson (Chalmers University of Technology) and Piet Van Duppen (KU Leuven) are recognised for the development and application of online instrumentation and techniques, and for the precise and systematic investigation of properties of nuclei far from stability at CERN’s Isotope mass Separator On-Line facility (ISOLDE).

Blaum has made key contributions to the high-precision determination of nuclear ground-state properties with laser and mass spectroscopic methods and to the development of new techniques in this field, while Jonson was acknowledged for his studies of the lightest exotic nuclei, namely halo nuclei, where he was the first to explain their surprisingly large matter radii. Van Duppen was recognised for driving the production and investigation of post-accelerated radioactive beams with REX-ISOLDE. Since the 1960s, the ISOLDE user facility has produced extreme nuclear systems to help physicists understand how the strong interaction binds the ingredients of atomic nuclei, with advanced traps and lasers recently offering new ways to look for physics beyond the Standard Model.

I’m very impressed by the breadth of the recent prize winners

Eckhard Elsen

The biennial Lise Meitner prize, named after one of the pioneers in the discovery of nuclear fission in 1939, was established in 2000 to acknowledge outstanding work in the fields of experimental, theoretical or applied nuclear science. Former winners include a quartet of physicists (Johanna Stachel, Peter Braun-Munzinger, Paolo Giubellino and Jürgen Schukraft) from the ALICE collaboration in 2014, for the experimental exploration of the quark-gluon plasma using ultra-relativistic nucleus-nucleus collisions, and for the design and construction of the ALICE detector.

This year’s awards were officially presented during the 2020 ISOLDE workshop and users meeting, held online on 26–27 November. “I’m very impressed by the breadth of the recent prize winners … covering a range of topics and varying between individuals and teams,” said CERN director for research and computing Eckhard Elsen during the award ceremony. “It is a good indicator of the health and the push of the field – it is truly alive.”

A unique period for computing, but will it last?

Monica Marinucci and Ivan Deloose

Twenty-five years ago in Rio de Janeiro, at the 8th International Conference on Computing in High-Energy and Nuclear Physics (CHEP-95), I presented a paper on behalf of my research team titled “The PC as Physics Computer for LHC”. We highlighted impressive improvements in price and performance compared to other solutions on offer. In the years that followed, the community started moving to PCs in a massive way, and today the PC remains unchallenged as the workhorse for high-energy physics (HEP) computing.

HEP-computing demands have always been greater than the available capacity. However, our community does not have the financial clout to dictate the way computing should evolve, so constant innovation and research in computing and IT are needed to maintain progress. A few years before CHEP-95, RISC workstations and servers had started complementing the mainframes that had been acquired at high cost at the start-up of LEP in 1989. We thought we could do even better than RISC. The increased-energy LEP2 phase needed lots of simulation, and the same needs were already manifest for the LHC. These needs inspired the move that saw PC servers start to populate our computer centres – a move also helped by a fair amount of luck.

Fast change

HEP programs need good floating-point compute capabilities, but early generations of Intel x86 processors, such as the 486/487 chips, offered only mediocre performance. The Pentium processors that emerged in the mid-1990s changed the scene significantly, and the competitive race between Intel and AMD was a major driver of continued hardware innovation.

Another strong tailwind came from the relentless efforts to shrink transistor sizes in line with Moore’s law, which saw processor speeds increase from 50/100 MHz to 2000/3000 MHz in little more than a decade. After 2006, when speed increases became impossible for thermal reasons, efforts moved to producing multi-core chips. However, HEP continued to profit. Since all physics events at colliders such as the LHC are independent of all others, it was sufficient to split a job into multiple jobs across all cores.
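This “embarrassingly parallel” structure is easy to exploit in practice; a minimal sketch is shown below (process_event is a stand-in for real reconstruction or simulation code):

```python
# Minimal sketch of trivially parallel event processing: events are
# independent, so the workload is simply partitioned across all cores.
from multiprocessing import Pool
import math
import os

def process_event(event_id):
    # placeholder for per-event reconstruction or simulation work
    return math.sin(event_id) ** 2

if __name__ == "__main__":
    n_cores = os.cpu_count()
    with Pool(processes=n_cores) as pool:
        results = pool.map(process_event, range(1_000_000), chunksize=10_000)
    print(f"processed {len(results)} independent events on {n_cores} cores")
```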

Sverre Jarp

The HEP community was also lucky with software. Back in 1995 we had chosen Windows NT as the operating system, mainly because it supported multiprocessing, which significantly enhanced our price/performance. Physicists, however, insisted on Unix. In 1991, Linus Torvalds released Linux version 0.01, and it quickly gathered momentum as a worldwide open-source project. When release 2.0 appeared in 1996, multiprocessing support was included and the operating system was quickly adopted by our community.

Furthermore, HEP adopted the Grid concept to cope with the demands of the LHC. Thanks to projects such as Enabling Grids for E-science, we built the Worldwide LHC Computing Grid, which today handles more than two million tasks across one million PC cores every 24 hours. Although grid computing remained mainly the preserve of scientific users, the analogous concept of cloud computing had the same cementing effect across industry. Today, all the major cloud-computing providers overwhelmingly rely on PC servers.

In 1995 we had seen a glimmer, but we had no idea that the PC would remain an uncontested winner during a quarter of a century of scientific computing. The question is whether it will last for another quarter of a century.

The contenders

The end of CPU scaling, argued a recent report by the HEP Software Foundation, demands radical changes in computing and software to ensure the success of the LHC and other experiments into the 2020s and beyond. There are many contenders that would like to replace the x86 PC architecture. It could be graphics processors, where Intel, AMD and Nvidia are all active. A wilder guess is quantum computing, whereas a more conservative guess would be processors similar to the x86 but based on other architectures, such as ARM or RISC-V.

The end of CPU scaling demands radical changes to ensure the success of the LHC and other high-energy physics experiments

During the PC project we collaborated with Hewlett-Packard, which had a division in Grenoble, not too far away. Such R&D collaborations have been vital to CERN and the community since the beginning, and they remain so today. They allow us to gain insight into forthcoming products and future plans, while our feedback can help to influence the products being planned. CERN openlab, which has been the focal point for such collaborations for two decades, coined the phrase “You make it, we break it” early on. However, whatever the future holds, it is fair to assume that PCs will remain the workhorse for HEP computing for many years to come.

Beating cardiac arrhythmia

EBAMed’s technical team

In December last year, a beam of protons was used to treat a patient with cardiac arrhythmia – an irregular beating of the heart that affects around 15 million people in Europe and North America alone. The successful procedure, performed at the National Center of Oncological Hadrontherapy (CNAO) in Italy, signalled a new application of proton therapy, which has been used to treat upwards of 170,000 cancer patients worldwide since the early 1990s.

In parallel to CNAO – which is based on accelerator technologies developed in conjunction with CERN via the TERA Foundation – a Geneva-based start-up called EBAMed (External Beam Ablation) founded by CERN alumnus Adriano Garonna aims to develop and commercialise image-guidance solutions for non-invasive treatments of heart arrhythmias. EBAMed’s technology is centred on an ultrasound imaging system that monitors a patient’s heart activity, interprets the motion in real time and sends a signal to the proton-therapy machine when the radiation should be sent. Once targeted, the proton beam ablates specific heart tissues to stop the local conduction of disrupted electrical signals.
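In outline, the gating logic is simple even if the imaging and motion interpretation behind it are not; the sketch below is a deliberately simplified illustration, with an invented tolerance, motion trace and interface rather than EBAMed parameters.

```python
# Simplified beam-gating sketch: assert the beam-on signal only while the
# tracked target stays within a tolerance of its planned position.
# All numbers here are illustrative, not EBAMed parameters.

def beam_gate(tracked_mm: float, planned_mm: float, tolerance_mm: float = 2.0) -> bool:
    """Return True (beam on) only when the target is within tolerance."""
    return abs(tracked_mm - planned_mm) <= tolerance_mm

# toy one-dimensional motion trace (mm) around a planned position of 0 mm
trace = [0.5, 1.2, 2.8, 4.0, 2.1, 0.9, -0.4, -2.5, -1.1, 0.2]
gates = [beam_gate(x, 0.0) for x in trace]
print("beam-on samples:", gates)
print(f"duty cycle over this trace: {sum(gates) / len(gates):.0%}")
```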

Fast learner

“Our challenge was to find a solution using the precision of proton therapy on a fast and irregular moving target: the heart,” explains Garonna. “The device senses motion at a very fast rate, and we use machine learning to interpret the images in real time, which allows robust decision-making.” Unlike current treatments, which can be lengthy and costly, he adds, people can be treated as outpatients; the intervention is non-invasive and “completely pain-free”.

The recipient of several awards – including TOP 100 Swiss Startups 2019, Venture Business Plan 2018, MassChallenge 2018, Venture Kick 2018 and IMD 2017 Start-up Competition – EBAMed recently received a €2.4 million grant from the European Union to fund product development and the first human tests.

Garonna’s professional journey began when he was a summer student at CERN in 2007, working on user-interface software for a new optical position-monitoring system at LHC Point 5 (CMS). Following his graduation, Garonna returned to CERN as a PhD student with the TERA Foundation and École Polytechnique Fédérale de Lausanne, and then as a fellow working for the Marie Curie programme PARTNER, a training network for European radiotherapy. This led to a position as head of therapy accelerator commissioning at MedAustron in Austria – a facility for proton and ion therapy based, like CNAO, on TERA Foundation/CERN technology. After helping deliver the first patient treatments at MedAustron, Garonna returned to CERN and entered informal discussions with TERA founder Ugo Amaldi, who was one of Garonna’s PhD supervisors, about how to take the technology further. Along with former CERN engineer Giovanni Leo and arrhythmia expert Douglas Packer, the group founded EBAMed in 2018.

“Becoming an entrepreneur was not my initial purpose, but I was fascinated by the project and convinced that a start-up was the best vehicle to bring it to market,” says Garonna. Not having a business background, he benefitted from the CERN Knowledge Transfer entrepreneurship seminars as well as the support from the Geneva incubator Fongit and courses organised by Innosuisse, the Swiss innovation agency. Garonna also drew on previous experience gained while at CERN. “At CERN most of my projects involved exploring new areas. While I benefitted from the support of my supervisors, I had to drive projects on my own, seek the right solutions and build the appropriate ecosystem to obtain results. This certainly developed an initiative-driven, entrepreneurial streak in me.”

Healthy competition

Proton therapy is booming, with almost 100 facilities operating worldwide and more than 35 under construction. EBAMed’s equipment can be installed in any proton-therapy centre irrespective of its technology, says Garonna. “We already have prospective patients contacting us as they have heard of our device and wish to benefit from the treatment. As a company, we want to be the leaders in our field. We do have a US competitor, who has developed a planning system using conventional radiotherapy, and we are grateful that there is another player on the market as it helps pave the way to non-invasive treatments. Additionally, it is dangerous to be alone, as that could imply that there is no market in the first place.”

Leaving the security of a job to risk it all with a start-up is a gradual process, says Garonna. “It’s definitely challenging to jump into what seems like cold water… you have to think if it is worth the journey. If you believe in what you are doing, I think it will be worth it.”

In pursuit of the possible

Giulia Zanderighi

What do collider phenomenologists do?

I tend to prefer the term particle phenomenology because the collider is just the tool that we use. However, compared to other experiments, such as those searching for dark matter or axions, colliders provide a controlled laboratory where you decide how many collisions and what energy these collisions should have. This is quite unique. Today, accelerators and detectors have reached an immense level of sophistication, and this allows us to perform a vast amount of fundamental measurements. So, the field spans precision measurements of fundamental properties of particles, in particular of the Higgs boson, consistency tests of the Standard Model (SM), direct and indirect searches for new physics, measurements of rare decays, and much more. For essentially all these topics we have had new results in recent years, so it’s a very active and continuously evolving field. But of course we do not just measure things for the sake of it. We have big, fundamental questions and we are looking for hints from LHC data as to how to address them.

What’s hot in the field today?

One topic that I think is very cool is that we can benefit from the LHC, in its current setup, also as a lepton collider. In fact, at the LHC we are looking at elementary collisions between the proton’s constituents, quarks and gluons. But since the proton is charged, it also emits photons, and one can talk about the photon parton-distribution function (PDF), i.e. the photonic content of protons. These photons can split into lepton pairs, so when one collides protons one is also colliding leptons. The fascinating thing is that the “content” of leptons in protons is rather democratic, so one can look at collisions between, say, a muon and a tau lepton – something that can’t be done even at future proposed lepton colliders. Furthermore, by picking up a lepton from one proton and a quark from the other proton, one can place new constraints on leptoquarks, and plenty of other things. This idea was already proposed in the 1990s, but was essentially forgotten because the lepton PDF was not known. Now we know this very precisely, bringing new possibilities. But let me stress that this is just one idea – there are many other new ideas out there. For instance, one major branch of phenomenology is to use machine learning or deep learning to recognise the SM and extract from data what is not SM-like.
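The mechanism Zanderighi describes can be written down schematically: at leading order in QED the lepton content of the proton is the photon PDF convoluted with the photon-to-lepton splitting function, and two such lepton densities define a lepton–lepton luminosity. The expressions below use standard factorisation notation and are an illustrative sketch, not a quotation from any particular calculation.

```latex
% Leading-order QED sketch: lepton PDF generated from the proton's photon PDF
\[
  f_{\ell/p}(x,\mu^{2}) \;\simeq\; \frac{\alpha}{2\pi}\,
  \ln\!\frac{\mu^{2}}{m_\ell^{2}}
  \int_x^1 \frac{dz}{z}\,\bigl[z^{2}+(1-z)^{2}\bigr]\,
  f_{\gamma/p}\!\left(\frac{x}{z},\mu^{2}\right)
\]
% Lepton-lepton luminosity (e.g. mu-tau) inside proton-proton collisions at energy sqrt(s)
\[
  \mathcal{L}_{\ell\ell'}(\hat s) \;=\;
  \int_0^1\! dx_1 \int_0^1\! dx_2\;
  f_{\ell/p}(x_1)\, f_{\ell'/p}(x_2)\;
  \delta\!\left(\hat s - x_1 x_2\, s\right)
\]
```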

I’m the first female director, which of course is a great responsibility

How does the Max Planck Institute differ from your previous positions, for example at CERN and Oxford?

A long time ago, somebody told me that the best thing that can happen to you in Germany is the Max Planck Society. It’s true. You are given independence and the means to fully focus on research and ideas, largely free of teaching duties or the need to apply for grants. Also, there are very valuable interactions with universities, be it in research or via the International Max Planck Research Schools for PhD students. Our institute in Munich is a unique place. One can feel it immediately. As a guest in the theory department, for example, you get to sit in the Heisenberg office, which feels like going back in time. Our institute was founded in Berlin in 1917 with Albert Einstein as its first director. In 1958 the institute moved to Munich with Werner Heisenberg as director. After more than 100 years I’m the first female director, which of course is a great responsibility. But I also really loved both CERN and Oxford. At CERN I felt like I was at the centre of the world. It is such a vibrant environment, and I loved the proximity to the experiments and the chats in the cafeteria about calculations or measurements. In Oxford I loved the multidisciplinary aspect, the dinners in college sitting next to other academics working in completely different fields. I guess I’m lucky that I’ve been in so many and such different places.

What is the biggest challenge to reach higher precision in quantum-field-theory calculations of key SM processes?


The biggest challenge is that often there is no single biggest challenge. For instance, for inclusive Higgs-boson production we have a number of theoretical uncertainties, but they are all quite comparable in size. This means that to reduce the overall uncertainty considerably, one needs to reduce all uncertainties, and they all have very different physics origins and difficulties – from a better understanding of the incoming parton densities and a better knowledge of the strong coupling constant, to higher-order QCD or electroweak effects and effects related to heavy particles in virtual loops, etc. Computing power can be a limiting factor for certain calculations, so making things numerically more efficient is also important. One of the main goals of the coming year will be the calculation of 2 → 3 scattering processes at the LHC at next-to-next-to-leading order (NNLO) in QCD. For instance, a milestone will be the calculation of top-pair production in association with a Higgs boson at that level of accuracy. This is the process in which we can most directly measure the top Yukawa coupling. The importance of this measurement can’t be overstated. While the big discovery at the LHC is so far the Higgs boson, one should also remember that the Yukawa interaction is a new type of fundamental interaction, which is proportional to the mass of the particle, just like gravity, and yet so different from gravity. For some calculations, NNLO is already enough in terms of perturbative precision; going to N3LO doesn’t really add much yet. But there are a few cases where it helps already, such as super-clean Drell–Yan processes.
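As a reminder of what the acronyms mean in practice, a hadron-collider cross section is computed as a truncated series in the strong coupling, and the top Yukawa coupling she mentions is fixed in the SM by the top mass. Both relations below are standard and are shown only for orientation:

```latex
% Fixed-order expansion: LO, NLO, NNLO, N3LO are successive terms in alpha_s
\[
  \sigma \;=\; \sigma^{(0)} \;+\; \alpha_s\,\sigma^{(1)} \;+\; \alpha_s^{2}\,\sigma^{(2)}
          \;+\; \alpha_s^{3}\,\sigma^{(3)} \;+\;\dots
\]
% SM relation between the top Yukawa coupling and the top mass (v ~ 246 GeV)
\[
  y_t \;=\; \frac{\sqrt{2}\,m_t}{v} \;\approx\; 1
\]
```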

Is there a level of precision of LHC measurements beyond which indirect searches for new physics are no longer fruitful?

We will never rule out precision measurements as a route to search for new physics. We can always extend the reach and enhance the sensitivity of indirect searches. By increasing precision, we are exploring deeper in the ultraviolet region, where we start to become sensitive to heavier and heavier states exchanged in loops. There is a limit, but we are very far from it. It’s like looking with a better and better microscope: the better the resolution, the more one can explore. However, the experimental precision has to go hand in hand with theoretical precision, and this is where the real challenge for phenomenologists lies. Of course, if you have a super-precise measurement but no theory prediction, or vice versa, then you can’t do much with it. With the Higgs boson I am confident that the theory calculations will not be the deal breaker. We will eventually hit the wall in terms of experimental precision, but you can’t put a figure on where this will happen. Until you see a deviation you are never really done.
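The scaling behind this “better microscope” picture is the standard effective-field-theory estimate: a heavy state of mass Λ integrated out at dimension six shifts observables by an amount that falls as 1/Λ². The numbers below are a back-of-the-envelope illustration, not a quoted result:

```latex
% Dimension-six EFT estimate of the indirect reach of a precision measurement
\[
  \frac{\delta O}{O} \;\sim\; c\,\frac{v^{2}}{\Lambda^{2}}
  \qquad\Longrightarrow\qquad
  \Lambda \;\lesssim\; v\,\sqrt{\frac{c}{\,\delta O/O\,}}
\]
% Example: with v = 246 GeV and c ~ 1, a 1% measurement probes Lambda ~ 2.5 TeV,
% while a 0.1% measurement extends the reach to roughly 8 TeV.
```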

How would you characterise the state of particle physics today?

When I entered the field as a student, there were high expectations that supersymmetry would be discovered quickly at the LHC. Now things have turned out to be different, but this is what makes it exciting and challenging – even more so, because the same mysteries are still there. We have big, fundamental questions. We have hints from theory, from experiments. We have a powerful, multi-purpose machine – the LHC – that is only just getting started and will provide much more data. Of course, expectations like the quick discovery of supersymmetry have not been fulfilled, but nature is how it is. I think that progress in physics is driven by experiments. We have beautiful exceptions where progress comes from theory, like general relativity, or the postulation of the mechanism for electroweak symmetry breaking. When I think about the Higgs mechanism, I am still astonished that such a simple and powerful idea postulated in 1964 turns out to be realised in nature. But these cases, where theory precedes experiments, are the exception not the rule. In most cases progress in physics comes from observations. After all, it is a natural science, it is not mathematics.

There are some questions that are really tough and that we may never see answered. But with the LHC there are many other, smaller questions we certainly can address, such as understanding the proton structure or studying the interaction potential between nucleons and strange baryons, which is relevant to understanding the physics of neutron stars, and these still advance knowledge. The brightest minds are attracted to the biggest problems, and this will always draw young researchers into the field.

Is naturalness a good guiding force in fundamental research?

Yes. We have plenty of examples where naturalness, in the sense of a quadratic sensitivity to an unknown ultraviolet scale, leads to postulating a new particle: the self-energy of the electron (leading to the positron), the charged–neutral pion mass difference (leading to the rho meson) or the kaon transition rates and mixing, which led to the postulation of the existence of the charm quark in 1970, before its direct discovery in 1974 at SLAC and Brookhaven. In everyday life we constantly assume naturalness, so yes, it is puzzling that the Higgs mass appears to be fine-tuned. Certainly, there is a lot we still don’t understand here, but I would not give up on naturalness, at least not that easily. In the case of the electroweak naturalness problem, it is clear that any solution, such as supersymmetry or compositeness, will also leave an imprint in the Higgs couplings. So the LHC can, in principle, tell us about naturalness even if we do not discover new physics directly; we just have to measure very precisely whether the Higgs-boson couplings align on a straight line in the mass-versus-coupling plane.
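The pion example she cites is worth spelling out, because it is the template for the Higgs case. The numbers below are textbook estimates quoted for illustration only:

```latex
% Electromagnetic contribution to the pion mass splitting grows quadratically with the cutoff
\[
  m_{\pi^\pm}^{2} - m_{\pi^0}^{2} \;\simeq\; \frac{3\alpha}{4\pi}\,\Lambda^{2}
  \qquad\Longrightarrow\qquad \Lambda \;\lesssim\; 850~\text{MeV},
\]
% and nature indeed cuts it off with the rho meson at about 775 MeV. The analogous
% quadratic sensitivity of the Higgs mass,
\[
  \delta m_H^{2} \;\sim\; \frac{g^{2}}{16\pi^{2}}\,\Lambda^{2},
\]
% is what suggests new states not far above the TeV scale. The "straight line" in the
% mass-versus-coupling plane is the SM prediction that the Higgs coupling to a fermion
% is g_hff = m_f / v.
```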

The presence of dark matter is overwhelming in the universe and it is embarrassing that we know little to nothing about its nature

Which collider should follow the LHC?

That is the billion-dollar question – I mean, the 25-billion-dollar question! To me, one should go for the machine that explores the new energy frontier as far as possible, namely a 100 TeV hadron collider. It is a compromise between what we might be able to achieve from a machine-building/accelerator/engineering point of view and really exploring a new frontier. For instance, at a 100 TeV machine one can measure the Higgs self-coupling, which is intimately connected with the Higgs potential and with the profound question of the stability of the vacuum.
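The connection between the self-coupling and the potential is the standard one: after electroweak symmetry breaking, the SM Higgs potential is fully fixed by the Higgs mass and the vacuum expectation value, so any measured deviation of the trilinear coupling from its SM value would signal a different potential. Schematically (a standard result, shown for orientation):

```latex
% SM Higgs potential expanded around the vacuum expectation value v ~ 246 GeV
\[
  V(h) \;=\; \tfrac{1}{2}\, m_H^{2}\, h^{2} \;+\; \lambda_{3}\, v\, h^{3}
        \;+\; \tfrac{1}{4}\,\lambda_{4}\, h^{4},
  \qquad
  \lambda_{3} \;=\; \lambda_{4} \;=\; \frac{m_H^{2}}{2 v^{2}} \quad\text{(SM)}
\]
% Di-Higgs production probes lambda_3 directly; a 100 TeV collider would measure it far
% more precisely than the LHC can, testing whether the potential has this simple form.
```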

Which open question would you most like to see answered during your career?

Probably the nature of dark matter. The presence of dark matter is overwhelming in the universe and it is embarrassing that we know little to nothing about its nature and properties. There are many exciting possibilities, ranging from the lightest neutral states in new-physics models to a non-particle-like interpretation, such as black holes. Either way, an answer to this question would be an incredible breakthrough.
