Despite several COVID waves, the organisers of the 6th edition of the International Summer School on Intelligent Signal Processing for Frontier Research and Industry (INFIERI) made this school an in-person event. The INFIERI school was successfully held at UAM from August 23 to September 4 thanks to the unprecedented speed of the vaccine roll-out, the responsible behaviour of the school participants and properly applied logistics.
Against a backdrop of topics ranging from cosmology to the human body and particle physics, the programme covered advanced technologies such as semiconductors, deep sub-micron 3D technologies, data transmission, artificial intelligence and quantum computing.
Topics were presented in lectures and keynote speeches, and the teaching was reinforced via hands-on laboratory sessions, allowing students to practise applications in realistic conditions across a range of areas, such as theoretical physics, accelerators, quantum communication, Si photonics and nanotechnology. The latter included medical applications such as the new mRNA vaccines, which have long been under investigation for cancer treatment, besides their use against COVID-19. Students could, for instance, analyse real combined PET/MRI images using machine-learning techniques to find biomarkers of illness in a hospital setting, or study the irradiation of a biomaterial using a proton beam. Worldwide experts from academia, industry and laboratories such as CERN either gave lectures or ran lab sessions, most of them attending in person, often for the entire duration of the school.
During the last day, the students presented posters on their own research projects – the high number and quality of presentations reflecting the cross-disciplinary facets and the excellence of the participants. Many were selected for the proceedings, which are in preparation for publication in the Journal of Instrumentation.
The next INFIERI school will only offer in-person attendance, which is considered essential to the series, but if the pandemic continues it will exploit some of the learning gained from the 6th edition.
Despite two decades of extensive studies, the production of antinuclei in heavy-ion collisions is not yet fully understood. It is usually described by two conceptually different theoretical approaches: the statistical hadronisation model (SHM) and coalescence models. In the SHM, antinuclei are produced from a locally thermally equilibrated source, whereas in coalescence models they are formed from the binding of constituent antinucleons that are close in momentum and position phase space. Both models predict very similar production yields of, for example, antideuterons, the bound states of an antiproton and an antineutron. This calls for new experimental observables that can discriminate between the production models.
Measuring higher moments of the antinuclei multiplicity distribution, as well as its correlation with the number of antinucleons produced in the collision, has recently been proposed as a sensitive probe of antinucleosynthesis processes in heavy-ion collisions. The first measurement of the variance-to-mean ratio of the antideuteron multiplicity distribution is compared to the predictions of the SHM and coalescence models (figure 1). The coalescence model fails to describe the observed ratio, whereas the measurement is consistent with the statistical baseline, a Poissonian distribution, as well as with the SHM in the presence of baryon-number conservation. However, this observable proves insensitive to the size of the correlation volume used in the SHM to conserve baryon number.
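As a numerical illustration of the statistical baseline (with toy parameters, not ALICE data): a Poisson-distributed count has a variance-to-mean ratio of exactly one, while distributing a fixed, conserved number of antibaryons into the acceptance yields a binomial count with variance/mean below one, mimicking the suppression expected from baryon-number conservation.

```python
import numpy as np

rng = np.random.default_rng(42)
n_events = 200_000

# Statistical baseline: Poissonian antideuteron counts (toy mean value)
poisson_counts = rng.poisson(lam=0.05, size=n_events)
r_poisson = poisson_counts.var() / poisson_counts.mean()  # close to 1

# Toy conservation effect: a fixed total of B antibaryons per event,
# each entering the acceptance with probability p, gives a binomial
# count with variance/mean = 1 - p, i.e. below the Poisson baseline
B, p = 10, 0.3
binom_counts = rng.binomial(B, p, size=n_events)
r_binom = binom_counts.var() / binom_counts.mean()  # close to 0.7

print(f"Poisson  var/mean = {r_poisson:.3f}")
print(f"Binomial var/mean = {r_binom:.3f}")
```

The toy shows only the qualitative effect; the SHM implements baryon-number conservation within a finite correlation volume, which is what the measurement constrains.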
The Pearson correlation coefficient between the numbers of produced antideuterons and antiprotons, by contrast, constrains the correlation volume effectively. The small negative correlation reflects the fact that fewer antiprotons are observed in events containing at least one antideuteron than in an average event (figure 1). The coalescence model does not reproduce the measurement, whereas the SHM can be fitted to it to extract the correlation volume. The obtained correlation volume is 1.6 times the volume of the fireball per unit of rapidity, smaller than the values that describe proton yields and a similar measurement of net-proton number fluctuations. These findings point to a later formation of the correlation among antiprotons and antideuterons compared to that among protons and antiprotons.
Overall, these results present a severe challenge to the current understanding of antinuclei production in heavy-ion collisions at the LHC energies. With the LHC Run 3 data it will be possible to extend these measurements to heavier antinuclei and to higher order correlation coefficients and moments of the antinuclei multiplicity distribution that are even more sensitive to details of the nucleosynthesis process in heavy-ion collisions.
Stuart Raby has written a modern, comprehensive textbook on quantum field theory, the Standard Model (SM) and its possible extensions. The focus of the book is on symmetries, and it contains a wealth of recent experimental results on Higgs and neutrino physics, which sets it apart from other textbooks addressing the same audience. It is published at a time when the incredible success story of the SM has come to a close with the discovery of the Higgs boson, and when the upcoming neutrino experiments promise to probe beyond-the-SM physics.
Raby is the author of some of the most important papers on supersymmetric grand unified theories and the book reflects that. It is no easy task to cover such a wide range of topics, from the basics of group theory to very advanced concepts such as gauge and gravity-mediated supersymmetry breaking, in one book. Raby devotes 120 pages to the basics of group theory, representations of the Poincaré group and the construction of the S matrix to provide the necessary foundations for the introduction to quantum electrodynamics in part III. Parts IV–VI introduce the reader to discrete symmetries, flavour symmetries and spontaneous symmetry breaking. Next, Raby describes two “Roads to the Standard Model” following the development of quantum chromodynamics and of electroweak theory, before arriving at the SM in part IX. The remaining parts deal with neutrino physics, grand unified theories and the minimal supersymmetric SM.
There are no omissions topic-wise, which makes the book very comprehensive. This comes at a price, however. In several places, complicated topics are discussed with only the most minimal of context, reading like a collection of equations rather than a textbook. Two examples of this are the discussion of causality for fermionic fields and the step from global to local supersymmetry, to which the author devotes only half a page each. In other places, more cross-referencing would improve legibility. For example, the chapter on SU(5) grand unified theory does not mention the automatic cancellation of gauge anomalies, a topic previously introduced in the context of the SM.
The use of materials is very distinctive. I doubt there is another book on the market that presents the reader with such a wealth of plots, figures and sketches, including recent experimental results on all the important topics discussed. The most important plots are reproduced in 12 pages of colour tables in the centre. There are exercises for the first five parts, and a single Mathematica notebook is printed for Wigner rotations. Another distinguishing feature is the detailed suggested projects for use during a two-term course based on the book.
A very useful resource for designing a lecture course on quantum field theory and beyond-the-SM physics
Although advertised as useful for both theorists and experimentalists, it is undeniably a book written from a theorist’s perspective. This becomes most clear in the latter parts, where relevant sections of the plots presenting experimental results remain unexplained. That being said, other very important experimental topics are explained that you will not find in other textbooks on the SM: Raby explains how the anomalous magnetic moments of the electron and the muon are measured, and goes into quite some detail on neutrino experiments.
The book would benefit from improved editing. For example, the units are sometimes in italics, sometimes not, some equations are double tagged, some plots do not have axis labels, and there is inconsistent use of wavy and curly lines in the Feynman diagrams. Raby does make good use of references though, and points the reader to other textbooks and original literature, although the index needs to be extended significantly to be useful.
I recommend this book for advanced undergraduates, graduate students and lecturers. It provides a very useful resource for designing a lecture course on quantum field theory and beyond-the-SM physics, and the amount of material covered is impressive and comprehensive. Beginners might be overwhelmed by Raby’s compact style, so I would recommend that those who are new to quantum field theory read a more accessible textbook in parallel.
This book by CERN’s Archana Sharma and her two students Robin Mathews and Ben Richardson merges the classic A-to-Z formula with CERN concepts, making it suitable for all audiences. Each letter is divided into four categories: physics, accelerator, computing and experiments, allowing the reader to get a good understanding of each area.
All concepts are described in a simple and understandable way, such as antimatter consisting of the same particles as matter but with opposite charge. More complex concepts are explained with fun facts to help the reader: the quark–gluon plasma is 100,000 times hotter than the centre of the Sun, and it would take 237,823 years to record a video call of 1 exabyte. Each description is accompanied by a photograph, logo or simulation representing the concept, which makes the book visually attractive for the reader.
Born at the start of the global pandemic, the A-to-Z of CERN arose from the need to tell science and technology stories at CERN when internships and summer lectures were either limited or cancelled. Overall, it provides an informative and entertaining glossary of CERN and particle physics in general, peppered with some general physics and technology concepts, such as the SI-unit system and even some non-CERN experiments, such as the former ZEUS experiment at DESY.
Marie-Noëlle Minard passed away on 15 May 2022. She began her career as a physicist in 1969, with a postgraduate thesis at the Institute of Nuclear Physics (IPN) in Orsay under the direction of Louis Massonnet, on the subject of high-energy neutron detectors. She joined the CNRS as a research associate, while still at the IPN, in 1972 and began her PhD studies exploring ways to detect exotic particles under the supervision of Michel Yvert. Minard defended her thesis in 1976 and joined the newly created Annecy particle-physics laboratory (LAPP). She then spent two years at SLAC, where she worked on the rapid-cycling bubble chamber. Back at LAPP in 1979, she joined the group of physicists involved in the UA1 collaboration at the CERN SppS proton–antiproton collider. The group participated in the construction of the electromagnetic calorimeter, analysis tools, data taking and physics analyses.
With her colleagues at LAPP and CERN, Marie-Noëlle created an analysis to search for Z bosons, exploiting the UA1 data extracted online by the 168E emulators – the so-called “express line”. It was by analysing these events that Marie-Noëlle spotted the first Z boson on the night of 4 May 1983 – a source of immense pleasure in her career.
In 1987 Marie-Noëlle turned to LEP physics. She created, with Daniel Décamp, the ALEPH group at LAPP. This idea came up against obstacles: there was already an L3 group, and the rule at the time was that each IN2P3 laboratory could only participate in one experiment at LEP (with one exception). This occasion demonstrated the measure of Marie-Noëlle’s determination: when she was convinced of the merits of a project, her enthusiasm and energy were such that she was able to convince even the most reluctant. She finally obtained the green light for an ALEPH team at LAPP, which, under her direction, made many contributions to the experiment. She herself was run coordinator, responsible for calibration, and a pillar of the di-fermion analysis group (measuring the Z lineshape).
In the early 2000s Jacques Lefrançois invited Marie-Noëlle to join the LHCb collaboration. The team at LAPP, under her direction, made a major contribution to the experiment, particularly in the construction and operation of calorimetric systems as well as in numerous physics analyses. Project manager of the calorimeter group during its start-up between 2008 and 2011, then assistant to the project manager until 2013, Marie-Noëlle participated in the commissioning and definition of control and calibration procedures for the calorimeters and ensured the continuous monitoring of the gains and of the aging of the cells of the electromagnetic calorimeter throughout the first period of data taking (2011–2013).
Between 2000 and 2006 Marie-Noëlle was deputy director of LAPP, during which time she contributed strongly to the definition and implementation of its scientific strategy. Always keen to communicate our science to the public, she used her creativity to organise several original and much-appreciated events. Marie-Noëlle supervised nine theses. For services rendered to research, she received one of the highest honours in France, being made a Chevalier de la Légion d’honneur.
She was certainly demanding, much more of herself than of others, but always convinced that in a group everyone makes a positive contribution. A physicist and communicator of immense talent, she was above all a woman of limitless generosity, with a sometimes caustic sense of humour. She was brave and couldn’t stand injustice, often expressing aloud what others were quietly thinking. Marie-Noëlle loved swimming, sailing, cooking, reading and welcoming her many friends and family to her table. Those who, like us, have had the chance to work with her will miss her boundless commitment, the relevance of her advice and her humanity. We are thinking of Claude, her husband of 50 years, and of her large family.
To explain the large matter–antimatter asymmetry in the universe, the laws of nature need to be asymmetric under a combination of charge-conjugation (C) and parity (P) transformations. The Standard Model (SM) provides a mechanism for CP violation, but it is insufficient to explain the observed baryon asymmetry in the universe. Thus, searching for new sources of CP violation is important.
The non-invariance of the fundamental forces under CP can lead to different rates between a particle and an antiparticle decay. The CP violation in the decay of a particle is quantified through the parameter ACP, equal to the relative difference between the decay rate of the process and the decay rate of the CP-conjugated process. Three years ago, the LHCb collaboration reported the first observation of CP violation in the decay of charmed hadrons by measuring the difference between the time-integrated ACP in D0 →K–K+ and D0 →π– π+ decays, ΔACP. This difference was found to lie at the upper end of the SM expectation, prompting renewed interest in the charm-physics community. There is now an ongoing effort to understand whether this signal is consistent with the SM or a sign of new physics.
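The definition above can be written down directly. A minimal sketch, with purely illustrative rates rather than LHCb yields:

```python
def a_cp(gamma, gamma_bar):
    """Time-integrated CP asymmetry: the relative difference between
    the decay rate of a process and that of its CP conjugate."""
    return (gamma - gamma_bar) / (gamma + gamma_bar)

# Toy example: a 0.1% difference between the two rates gives an
# asymmetry of O(10^-4), the scale probed in charm decays
print(f"{a_cp(1.0005, 0.9995):.1e}")
```

In practice the raw asymmetry measured from event yields also contains production and detection asymmetries, which is why the control channels discussed below are needed.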
At the 41st ICHEP conference in Bologna on 7 July, the LHCb collaboration announced a new measurement of the individual time-integrated CP asymmetry in the D0 →K–K+ decay using the data sample collected during LHC Run 2. The measured value, ACP(K–K+) = [6.8±5.4(stat)±1.6(syst)]×10⁻⁴, is almost three times more precise than the previous LHCb determination obtained with Run 1 data, thanks not only to a larger data sample but also to the inclusion of the additional control channels Ds+ →K– K+ π+ and Ds+ →Ks0 K+. Together with the previous control channels, D+ →K– π+ π+ and D+ →Ks0 π+, these decays allow tiny CP-asymmetry signals to be separated from the much larger biases due to asymmetric meson production and instrumental effects.
Combining these values with the previous LHCb measurements of ACP(K–K+) and ΔACP allowed the direct CP asymmetries in the D0 →π– π+ and D0 →K– K+ decays to be determined: [23.2±6.1]×10⁻⁴ and [7.7±5.7]×10⁻⁴, respectively, with correlated uncertainties (ρ = 0.88). This is the first evidence of direct CP violation in an individual charm–hadron decay (D0 →π– π+), with a significance of 3.8σ.
The sum of the two direct asymmetries, which is expected to vanish in the limit of s–d quark symmetry (called U-spin symmetry), is equal to [30.8±11.4]×10⁻⁴. This corresponds to a departure from U-spin symmetry of 2.7σ. In addition, this result is essential input for the theory community in the quest to clarify the theoretical picture of CP violation in the charm system. Since the measurement is statistically limited, its precision will improve with the larger dataset collected during Run 3.
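The quoted sum and its significance follow from standard error propagation for correlated quantities, σ²(a+b) = σa² + σb² + 2ρσaσb. A quick cross-check using the rounded numbers quoted above (which reproduces the published result up to rounding of the inputs):

```python
from math import sqrt

# Direct CP asymmetries in units of 10^-4, as quoted in the text
a_pipi, s_pipi = 23.2, 6.1   # D0 -> pi- pi+
a_kk,   s_kk   = 7.7,  5.7   # D0 -> K- K+
rho = 0.88                   # correlation between the two results

total = a_pipi + a_kk
sigma = sqrt(s_pipi**2 + s_kk**2 + 2 * rho * s_pipi * s_kk)

# Rounded inputs give 30.9 +/- 11.4, vs the quoted 30.8 +/- 11.4
# obtained from the unrounded values
print(f"sum = {total:.1f} +/- {sigma:.1f} (x 10^-4)")
print(f"departure from U-spin symmetry: {total / sigma:.1f} sigma")
```

Note how the large positive correlation inflates the uncertainty of the sum well beyond the quadrature value, which is why the departure from U-spin symmetry is only 2.7σ despite the sizeable central value.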
The cryogenic infrastructure of the Large Hadron Collider (LHC) at CERN is the most complex helium refrigeration system of all the world’s research facilities.
The operation of the LHC’s cryogenic system began in 2008 after reception testing and a first cool-down to 1.9 K. This webinar will cover the design, operational experience and main challenges linked to the accelerator, along with the physics requirements.
During the first stage, the operation team had to learn the responsiveness and limitations of the system. They then had to manage stable operation by maintaining the necessary conditions for the superconducting magnets, RF cavities, electrical feed boxes, power links and detector devices, thus contributing to the physics programme and the discovery of the Higgs boson in 2012.
One of the most challenging parameters impacting the cryogenics was the beam-induced heat load, which rose from the start of the LHC’s second operation period (Run 2) in 2015 with increased beam parameters. A complex optimisation of the cryogenic-system configuration was successfully applied to cope with these requirements.
Preparation for Run 3, which started in 2020, required the handling of several hundred magnet training quenches on the way to the nominal beam energy for physics production.
Now, after several years of operational experience with steady-state and transient handling, the cryogenic system is being optimised to provide the necessary refrigeration, whilst incorporating the all-important aspect of energy saving.
In conclusion, there will be a brief discussion of the next four years of operation.
Krzysztof Brodzinski is a senior staff member in the cryogenics group at the technology department at CERN. He is a mechanical engineer with a specialisation in refrigeration equipment, and graduated from Cracow University of Technology in Poland. Krzysztof joined the LHC cryogenic design team in 2001, has been a member of the cryogenic operation team since 2009 and in 2019 was mandated as a section leader of the cryogenic operation team for the LHC, ATLAS and CMS. He is also involved in the engineering of the cryogenic system for the HiLumi LHC RF deflecting cavities project, as well as participating in the ongoing FCC cryogenics study.
It was March 1977 when the hypothetical Higgs boson first made its way onto the pages of this magazine. Reporting on a talk by Steven Weinberg at the Chicago Meeting of the American Physical Society, the editors noted the dramatic success of gauge theories in explaining recent discoveries at the time — beginning with the observation of the neutral current at CERN in 1973 and the “new physics” following the J/ψ discovery at Brookhaven and Stanford the following year, observing: “The theories also postulate a set of scalar particles in a similar mass range… If Higgs bosons exist, they will affect particle behaviour at all energies. However, their postulated interactions are even weaker than the normal weak interactions. The effects would only be observable on a very small scale and would usually be drowned out by the stronger interactions.”
The topic clearly drew the attention of readers, as just a few issues later, in September 1977, the editors delved deeper into the origins of the Higgs boson and its role in spontaneous symmetry breaking, offering Abdus Salam’s “personal picture” to communicate this abstruse concept: “Imagine a banquet where guests sit at round tables. A bird’s eye view of the scene presents total symmetry, with serviettes alternating with people around each table. A person could equally well take a serviette from his right or from his left. The symmetry is spontaneously broken when one guest decides to pick up from his left and everyone else follows suit.”
Within a year, CERN Courier was on the trail of how the Higgs boson might show itself experimentally. Reporting on a “Workshop on Producing High Luminosity Proton–Antiproton Storage Rings” held at Berkeley, the April 1978 issue stated: “As well as the intermediate boson, the proton–antiproton colliders could give the first signs of the Higgs particles or of other unexpected states. While the discovery of weak neutral currents and charm provided impressive evidence for the gauge-theory picture that unifies electromagnetic and weak interactions, one prediction of this picture is the existence of spinless Higgs bosons. If these are not found at higher energies, some re-thinking might be required.” In the December 1978 issue, with apologies to Neil Armstrong, the Courier ran a piece titled “A giant LEP for mankind”. The hope was that with LEP, physicists had the tool to explore in depth the details of the symmetry breaking mechanism at the heart of weak interaction dynamics.
The award of the 1979 Nobel Prize in Physics to Weinberg, Glashow and Salam for the electroweak theory received full coverage in December that year, with the Courier expressing confidence in the Higgs: “Another vital ingredient of the theory which remains to be tested are the Higgs particles of the spontaneous symmetry breaking mechanism. Here the theory is still in a volatile state and no firm predictions are possible. But this mechanism is crucial to the theory, and something has to turn up.”
A Higgs for the masses
To many people, wrote US theorist Sam Treiman in November 1981, the Higgs particle looks somewhat artificial — “a kind of provisional stand-in for deeper effects at a more fundamental level”. Four years later, “with several experiments embarking on fresh Higgs searches”, Richard Dalitz and Louis Lyons organised a neatly titled workshop “Higgs for the masses” to review the theoretical and experimental status. Another oddity of the Higgs, wrote Lyons, is that unless it is very light (less than 10⁻¹⁷ eV), the Higgs should make the universe curved, “contributing more to the cosmological constant than the known limit permits”. Lower limits (from spontaneous symmetry breaking) and upper limits (from the unitarity requirement) open up a wide range of masses for the Higgs to manoeuvre — between 7 and 1000 GeV, he noted. “From time to time, new ‘bumps’ and effects are tentatively put forward as candidate Higgs, but so far none are convincing.”
LEP’s electroweak adventure reached a dramatic climax in the summer of 2000, with hints that a light Higgs boson was showing itself. In October, the machine was granted a stay of Higgs execution. Alas, the signal faded, and the final curtain fell on LEP in November — a “LEPilogue” heralding the beginning of a new era: the LHC.
Discussions about a high-energy hadron collider were ongoing long before: ICFA’s Future Perspectives meeting at Brookhaven in October 1987 noted two major hadron collider projects on the market: “the US Superconducting Supercollider, with collision energies of 40 TeV in an 84 kilometre ring, and the CERN Large Hadron Collider, with up to 17 TeV collision energies”. In December 1994, shortly after CERN turned 40, Council provided the lab with “The ultimate birthday present”: the unanimous approval of the LHC. Fourteen years later, the LHC started up and brought particle physics to the world.
Together with LEP data, measurements from Fermilab’s CDF and DØ experiments and the LHC’s 2011 campaign narrowed the possible mass range for the Higgs boson to between 115 and 127 GeV. The first tantalising hints of the Higgs boson were presented on 13 December 2011. The quest remained open for another half a year, until Director-General Rolf Heuer, following the famous talks by ATLAS and CMS spokespersons Fabiola Gianotti and Joe Incandela, concluded on 4 July 2012: “As a layman I would say: I think we have it”. It was a day to remember: a breakthrough discovery rooted in decades of work by thousands of individuals that rocked the CERN auditorium and reverberated around the world. A new chapter in particle physics had begun…
To mark the 10th anniversary of this momentous event, from Monday 4 July the Courier will be exploring the theoretical and experimental effort behind the Higgs-boson discovery, the immense progress made by ATLAS and CMS in our understanding of this enigmatic particle, and the deep connections between the Higgs boson and some of the most profound open questions in fundamental physics.
Wherever the Higgs boson leads, CERN Courier will be there to report!
Ten years ago, a few small bumps in ATLAS and CMS data confirmed a 48-year-old theoretical prediction, and particle physics hasn’t been the same since. Behind those sigmas was the hard work, dedication, competence and team spirit of thousands of experimentalists and accelerator physicists worldwide. Naturally it was a triumph for theory, too. Peter Higgs, François Englert, Carl Hagen and Gerald Guralnik received a standing ovation in the CERN auditorium on 4 July 2012, although Higgs insisted it was a day to celebrate experiment, not theory. The Nobel prize for Englert and Higgs came a year later. Straying from tradition for elementary-particle discoveries, the citation explicitly acknowledged the experimental effort of ATLAS and CMS, the LHC and CERN.
The implications of the Higgs-boson discovery are still being understood. Ten years of precision measurements have shown the particle to be consistent with the minimal version required by the Standard Model. Combined with the no-show of non-Standard Model particles that were expected to accompany the Higgs, theorists are left scratching their heads. As we celebrate the collective effort of high-energy physicists in discovering the Higgs boson and determining its properties, another intriguing journey has opened up.
Marvelously mysterious
As “a fragment of vacuum” with the barest of quantum numbers, the Higgs boson is potentially connected to many open questions in fundamental physics. The field from which it hails governs the nature of the electroweak phase transition in the early universe, which might be connected with the observed matter–antimatter asymmetry; as the only known elementary scalar particle, it could serve as a portal to other, hidden sectors relevant to dark matter; its couplings to matter particles — representing a new interaction in nature — may hold clues to the puzzling hierarchy of fermion masses; and its interactions with itself have implications for the ultimate stability of the universe.
Nobody knows what the Higgs boson has in store
With the LHC and its high-luminosity upgrade, physicists have 20 years of Higgs exploration to look forward to. But to fully understand the shape of the Brout–Englert–Higgs potential, the couplings of the Higgs boson to Standard Model particles and its possible connections to new physics, a successor collider will be needed. It is fascinating to picture future generations of particle physicists working as one with astroparticle physicists, cosmologists, quantum technologists and others to fill out the details of this potential new vista, with colliders driving progress alongside astrophysical, cosmological and gravitational-wave observatories. Future colliders aren’t just about generating knowledge, argues Anna Panagopoulou of the European Commission, but are “moonshots” delivering a competitive edge in technology, innovation, education and training — opening adventures that inspire young people to enter science in the first place.
Nobody knows what the Higgs boson has in store. Perhaps further studies will confirm the scenario of a Standard-Model Higgs and nothing else. The sheer number and profundity of known unknowns in the universe would suggest otherwise, think theorists. The good news is that, in the Higgs boson, physicists have clear measurement targets – and in principle the necessary theoretical and experimental machinery – to explore such mysteries, building upon the events of 4 July 2012 to reach the next level of understanding in fundamental physics.
The search for the Higgs boson is the kind of adventure that draws many young people to science, even if they go on to work in more applied areas. I first set out to become a nuclear physicist, and even applied for a position at CERN, before deciding to specialise in electrical engineering and then moving into science policy. Today, my job at the European Commission (EC) is to co-create policies with member states and stakeholders to shape a globally competitive European research and innovation system.
Large research infrastructures (RIs) such as CERN have a key role to play here. Having visited CERN for the first time last year, I was impressed not just by the basic research but also by the services that CERN provides the collaborations, its relationships with industry, and its work in training and educating young people. It is truly an example of what it means to collaborate on an international level, and it helped me understand better the role of RIs in research and innovation.
Innovation is one of three pillars of the EC’s €95.5 billion Horizon Europe programme for the period 2021–2027. The first pillar is basic science, and the second concerns applied research and knowledge diffusion. Much of the programme’s focus is “missions” geared to societal challenges such as soil, climate and cancer, driven by the UN’s 2030 Sustainable Development Goals. So where does a laboratory like CERN fit in? Pillar one is the natural home of particle physics, where there is well established support via European Research Council grants, Marie Skłodowska-Curie fellowships and RI funding. On the other hand, the success of the Horizon Europe missions relies on the knowledge and new technologies generated by the RIs.
We view the role of RIs as driving knowledge and technology, and ensuring it is transferred in Europe – acting as engines in a local ecosystem involving other laboratories and institutes, hospitals and schools, attracting the best people and generating new labour forces. COVID-19 is a huge social challenge that we also managed to address using basic research, RIs and opening access to data. This is a clear socioeconomic impact of current research and also data collected in the past.
Open science is a backbone of Horizon Europe, and an area where particle physics and CERN in particular are well advanced. I chair the governance board of the European Open Science Cloud, a multi-disciplinary environment where researchers can publish, find and re-use data, tools and services, in which CERN has a long-standing involvement.
Indeed, the EC has established a very strong collaboration with CERN across several areas. Recently we have been meeting to discuss the proposed Future Circular Collider (FCC). The FCC is worth not just discussing but supporting, and we are already doing so via significant projects. We are now discussing possibilities in Horizon Europe to support more technological aspects, but clearly EU money is not enough. We need commitment from member states, so there needs to be a political decision. And to achieve that we need a very good business plan that turns the long-term FCC vision into clearly defined short-term goals and demonstrates its stability and sustainability.
Societal impact
Long-term projects are not new to the EC: we have ITER, for example, while even the neutrality targets for the green-deal and climate missions are for 2050. The key is to demonstrate their relevance. There is sometimes a perception that people doing basic research are closed in their bubble and don’t realise what’s going on in the “real” world. The space programme has managed to demonstrate over the years that there are sufficient applications providing value beyond its core purpose. Nowadays, with issues of defence, security and connectivity rising up political agendas, researchers can always bring to the table that their work can help society address its needs. For big RIs such as the FCC we need to demonstrate first: what is the added value, even if it’s not available today? Why is it important for Europe? And what is the business plan? The FCC is not a typical project. To attract and convince politicians and finance ministers of its merits, it has to be presented in terms of its uniqueness.
The FCC brings to mind the Moon landings
The FCC brings to mind the Moon landings. Contrary to popular depictions, this was a long-term project that built on decades of competitive research from different countries. Yes, it was a period during the Cold War, but it was also the basis of fruitful collaboration. If we don’t dare to spend money on projects that bring us to the future then we lose, as Europe, a competitive advantage.