ATLAS – A 25 Year Insider Story of the LHC Experiment

ATLAS – A 25 Year Insider Story of the LHC Experiment is a comprehensive overview of one of the most complex and successful scientific endeavours ever undertaken. 117 authors collaborated to write on diverse aspects of the ATLAS project, ranging from the early days of the proto-collaboration, through the test-beam studies that verified detector concepts, the design, building and installation of the detector systems, the construction of the event-selection and computing environment, and the formation of the organisation, to a summary of the harvest of physics gathered thus far. Some of the chapters cover topics that are discussed elsewhere – the description of the detector summarises more extensive journal publications, the major physics achievements have been covered in recent review articles and the organisational structure is discussed on the web – but this volume usefully brings these various aspects together in a single place with a unified treatment.

Despite the many authors who contributed to this book, the style and level of treatment are reasonably coherent. There are many figures and pictures that augment the text. Those showing detector elements that are now buried out of sight are important complements to the text descriptions; the pictures of circuit boards are less helpful, beyond demonstrating that these electronics exist. A most engaging feature is the inclusion of one-page “stories” at the ends of the chapters, each giving some insight into the ups and downs of how the enterprise works. Among these vignettes we have such stories as the ATLAS collaboration week that almost no one attended and the spirit of camaraderie among the experimenters and accelerator operators at the daily 08:30 run meetings.

One could imagine several audiences for this book, and I suspect that, apart from ATLAS collaborators themselves, each reader will find different chapters most suited to their interests. The 26-page chapter “The ATLAS Detector Today” offers a more accessible overview for students just joining the collaboration than the 300-page journal publication referenced in most ATLAS publications. Similarly, “Towards the High-Luminosity LHC” gives a helpful brief introduction to the planned upgrades. “Building up the Collaboration” will be useful to historians of science seeking to understand how scientists, institutions and funding agencies engage in a project whose time is ripe. Those interested in project management will find “Detector Construction Around the World” illuminating: this chapter shows how the design and fabrication of detector subsystems is organised with several, often geographically disparate, institutions joining together, each contributing according to its unique talents. “From the LoI to the Detector Construction” and “Installation of the Detectors and Technical Coordination” will appeal to engineers and technical managers. The chapters “Towards the ATLAS Letter of Intent” and “From Test Beams to First Physics” catalogue the steps that were necessary to realise the collaboration and experiment, but whose details are primarily interesting to those who lived through those epochs. Finally, “Highlights of Physics Results (2010 – 2018)” could have offered an exciting story for non-scientists, and indeed the thrill of the chase for the Higgs boson comes through vividly, but with unexplained mentions of leptons, loops and quantum corrections, the treatment is at a more technical level than would be ideal for such readers, and the plots plucked from publications are not best suited to convey what was learned to non-physicists.

What makes a collaboration like this tick?

Given the words in the foreword that the book is “intended to provide an insider story covering all aspects of this global science project,” I looked forward to the final chapter, “ATLAS Collaboration: Life and its Place in Society”, to get a sense of the human dimensions of the collaboration. While some of that discussion is quite interesting – the collaboration’s demographics and the various outreach activities undertaken to engage the public – there is a missing element that I would have appreciated: what makes a collaboration like this tick? How did the large personalities involved manage to come to common decisions and compromises on the detector designs? How do physicists from nations and cultures that are at odds with each other on the world stage manage to work together constructively? How does one account for the distinct personalities that each large scientific collaboration acquires? Why does every eligible author sign all ATLAS papers, rather than just those who did the reported analysis? How do the politics of choosing the collaboration’s management play out? Were there design choices that came to be regretted in the light of subsequent experience? In addition to the numerous successes, were there failures? Although I recognise that discussing these more intimate details runs counter to the spirit of such large collaborations, in which one seeks to damp out as much internal conflict as possible, examining some of them would have made for a more compelling book for the non-specialist.

The authors should be commended for writing a book unlike any other I know of. It brings together a factual account of all aspects of ATLAS’s first 25 years. Perhaps as time passes and the participants mellow, the companion story of the how, in addition to the what and where, will also be written.

Taking the temperature of collider calorimetry

Concise and accessible, Calorimetry for Collider Physics is a reference book worthy of the name. Well-known experts Michele Livan and Richard Wigmans have written an up-to-date introduction to both the fundamental physics and the technical parameters that determine the performance of calorimeters. Students and senior experts alike will be inspired to deepen their study of the characteristics of these instruments – instruments that have become crucial to most contemporary experiments in particle physics.

Following a light and attractive introductory chapter, the reader is invited to refresh his or her knowledge of the interactions of particles with matter. Key topics such as shower development, containment and profile, linearity and energy resolution are discussed for both electromagnetic and hadronic components. The authors provide illustrations with test-beam results and detailed Monte Carlo simulations. Practical and numerical examples help the reader to understand even counterintuitive effects, stimulating critical thinking in detector designers, and helping the reader develop a feeling for the importance of the various parameters that affect calorimetry.

The authors do not shy away from criticising calorimetric approaches

An important part of the book is devoted to hadron calorimetry. The authors have made a remarkably strong impact on the understanding of its fundamental problems through large set-ups in test beams, for example the famous lead-fibre sampling spaghetti calorimeter SPACAL. Among other issues, they correct “myths” as to which processes really cause compensation, and discuss quantities that correlate with the invisible energy fraction carried by hadrons in the shower process, which allows, for example, the electromagnetic shower fraction to be measured event-by-event. The topical development of the dual-readout calorimeter concept follows logically from there – a very promising future direction for this central detector component, as the book discusses in considerable detail. This technology would sidestep the question of longitudinal segmentation, which has a particular impact on linearity and calibration.

Calorimetry Wigmans Livan

Livan and Wigmans’ book also gives a valuable historical overview of the field, and corrects several erroneous interpretations of past experimental results. The authors do not shy away from criticising calorimetric approaches in former, present and planned experiments, making the book “juicy” reading for experts. The reader will not be surprised that the authors are, for example, rather critical about highly segmented calorimeters aiming at particle-flow approaches.

There is only limited discussion about other aspects of calorimetry, such as triggering, measuring jets and integrating calorimeters into an overall detector concept, which may impose many constraints on their mechanical construction. These aspects were obviously considered beyond the scope of the book, and indeed one cannot pack everything into a single compact textbook, though the authors do include a very handy appendix with tables of parameters relevant to calorimetry.

By addressing the fundamentals of calorimetry, Livan and Wigmans have provided an outstanding reference book. I recommend it highly to everybody interested in basic detector aspects of experimental physics. It is pleasant and stimulating to read, and if in addition it triggers critical thinking, so much the better!

When twistors met loops

Loop Quantum Gravity and Twistor Theory have a lot in common. They both have quantum gravity as a main objective, they both discard conventional spacetime as the cornerstone of physics, and they have both taken major inspiration from renowned British mathematician Roger Penrose. Interaction between the two communities has been minimal so far, however, due to their distinct research styles: mathematically oriented in Twistor Theory, but focused on empirical support in Loop Gravity. This separation was addressed in the first week of September at a conference held at the Centre International de Rencontres Mathématiques (CIRM) on the Luminy campus in Marseille, where about a hundred researchers converged for lively debates designed to encourage cross-fertilisation between the two research lines.

Both Twistor Theory and Loop Gravity regard conventional smooth general-relativistic spacetime as an approximate and emergent notion. Twistor theory was proposed by Roger Penrose as a general geometric framework for physics, with the long-term aim of unifying general relativity and quantum mechanics. The main idea of the theory is to work with null rays – the space of the possible paths that light rays can follow in spacetime – instead of the manifold of the points of physical spacetime. Spacetime points, or events, are then seen as derived objects: they are given by compact holomorphic curves in a complex three-fold: twistor space. It is remarkable how much the main equations of fundamental physics simplify when formulated in these terms. The mathematics of twistors has roots in the 19th-century Klein correspondence in projective geometry, and modern Twistor Theory has had a strong impact on pure mathematics, from differential geometry and representation theory to gauge theories and integrable systems.

Could allying twistors and loops be dangerous?

Loop gravity, on the other hand, is a background-independent theory of quantum gravity. That is, it does not treat spacetime as the background on which physics happens, but rather as a dynamical entity itself satisfying quantum theory. The conventional smooth general-relativistic spacetime emerges in the classical (ℏ→0) limit, in the same manner as a smooth electromagnetic field satisfying the Maxwell equations emerges from the Fock space of the photons in the classical limit of quantum electrodynamics. Similarly, the full dynamics of classical general relativity is recovered from the quantum dynamics of Loop gravity in a suitable limit. The transition amplitudes of the theory are finite in the ultraviolet and are expressed as multiple integrals over non-compact groups. The theory provides a compelling picture of quantum spacetime. A basis in the Hilbert space of the theory is described by the mathematics of spin networks: graphs with links labelled by SU(2) irreducible representations, independently introduced by Roger Penrose in the early 1970s in an attempt to build a fully discrete, combinatorial picture of quantum physical space. Current applications of Loop Gravity include early cosmology, where the possibility of a bounce replacing the Big Bang has been extensively studied using Loop gravity methods, and black holes, where the theory’s amplitudes can be used to study the non-perturbative transition at the end of the Hawking evaporation.

The communities working in Twistors and Loops share technical tools and conceptual pillars, but have evolved independently for many years, with different methods and different intermediate goals. But recent developments discussed at the Marseille conference saw twistors appearing in formulations of the loop gravity amplitudes, confirming the fertility and the versatility of the twistor idea, and raising intriguing questions about possible deeper relations between the two theories.

The conference was a remarkable success. It is not easy to communicate across research programs in contemporary fundamental physics, because a good part of the field is stalled in communities blocked by conflicting assumptions, ingrained prejudices and seldom questioned judgments, making understanding one another difficult. The vibrant atmosphere of the Marseille conference cut through this.

The best moment came during Roger Penrose’s talk. Towards the end of a long and dense presentation of new ideas towards understanding the full space of the solutions of Einstein’s theory using twistors, Roger said rather dramatically that now he was going to present a new big idea that might lead to the twistor version of the full Einstein equations – but at that precise moment the slide projector exploded in a cloud of smoke, with sparks flying. We all thought for a moment that a secret power of the Universe, worried about being unmasked, had interfered. Could allying twistors and loops be dangerous?

Particle physicists challenge EC rebranding

An open letter addressed to the presidents of the European Parliament and the European Commission (EC) demanding better recognition for education and research has closed, having attracted around 13,600 signatories during the past two months.

Published on 17 September by a group of eight prominent particle physicists in Europe – Siegfried Bethke (MPI for Physics), Nora Brambilla (TU-München), Aldo Deandrea (U-Lyon 1), Carlo Guaraldo (INFN Frascati), Luciano Maiani (U-Roma La Sapienza), Antonio Pich (U-València), Alexander Rothkopf (U-Stavanger) and Johanna Stachel (U-Heidelberg) – the letter followed the announcement of a new EC organisational structure on 10 September in which former EC directorates for education, culture, sports and youth, as well as that for research, science and innovation, have been subsumed under a single commissioner with the titular brief “innovation and youth”.

“Words are important,” says Maiani, who was CERN Director-General from 1999–2003. “Omitting ‘research’ from the logo of the EC is reason for concern. The response we received, including from prestigious personalities, reassured us that this concern is widely shared.”

With signatories including hundreds of university and laboratory leaders, 19 Nobel laureates and many institutions including the European, French and German physical societies, the letter demands that the EC revise the title of the brief to “Education, Research, Innovation and Youth”. It states: “We, as members of the scientific community of Europe, wish to address this situation early on and emphasise both to the general public, as well as to relevant politicians on the national and European Union level, that without dedication to education and research there will neither exist a sound basis for innovation in Europe, nor can we fulfill the promise of a high standard of living for the citizens of Europe in a fierce global competition.”

Of course we are disappointed that the voices of more than 13,600 scientists went unheard

Johanna Stachel, University of Heidelberg

The letter closed on 13 November after the EC issued a press release stating that it will rename three commissioner portfolios, but that the “innovation and youth” brief of commissioner-designate Mariya Gabriel is not among them. “Naturally we are disappointed, and even frightened as the decisions about non-renaming prove that the omission of research and education in the title signals how low these fields may be valued by the new commission,” says Bethke.

“We will keep pushing,” adds Stachel. “But of course we are disappointed that the voices of more than 13,600 scientists went unheard, despite many prominent voices and also significant press coverage.”

On 18 November, Rothkopf responded to European parliament president David Sassoli on behalf of the initial signatories with a letter, stating: “It is with great disappointment that we recognize that the voice of science has not reached the ears of the Commission. The intention of the Commission is to stimulate innovation. But we reiterate with force that without research and education there is no future to innovation… We are counting on you, Mr. President, to represent the voice of all European citizens who have signed up as supporter of the open letter and in the interest of European research.”

Update 28th November: Speaking at a plenary session of the European Parliament in Strasbourg on 27 November, European Commission (EC) president-elect Ursula von der Leyen announced that the brief of commissioner Mariya Gabriel would be renamed “Innovation, research, culture, education and youth”. The addition of “education” and “research” to the initial title of the brief announced on 10 September was met with applause in the chamber.

The Weil conjectures

“I am less interested in mathematics than in mathematicians,” wrote Simone Weil to her brother André, a world-class mathematician who was imprisoned in Rouen at the time. The same might be said about US novelist and onetime mathematics student Karen Olsson. Despite the title, her new book, The Weil Conjectures, stars the extraordinary siblings at the expense of André’s mathematical creation.

First conceived by André in prison, and finally proven three decades later by Pierre Deligne in 1974, the Weil conjectures are foundational pillars of algebraic geometry. Linking the continuous and the discrete, and the realms of topology and number theory, they are pertinent to efforts to unite gravity with the quantum theories of the other forces. Frustratingly, though, mathematical hobbyists hoping for insights into the conjectures will be disappointed by this book, which instead zeroes in on the people in orbit around the maths.

Olsson is particularly fascinated by Simone Weil. An iconoclastic public intellectual in France, and possessed by an intensely authentic humanity that the author presents as quite alien to André, Simone was nevertheless envious of her brother’s mathematical insight, writing that she “preferred to die than to live without that truth”. Olsson is clearly empathetic, and so, one would suspect, will be most readers in a profession where intellect is all. Whether one is a grad student or a foremost expert in the field, there is always someone smarter, whose insights seem inaccessible.

Physicists may also detect echoes of the current existential crisis in theoretical physics (see Redeeming the role of mathematics) in Simone’s thinking. While she feels that “unless one has exercised one’s mind at the gymnastics of mathematics, one is incapable of precise thought, which amounts to saying that one is good for nothing,” she criticises “the absolute dominion that is exercised over science by the most abstract forms of mathematics.”

Peppered with anecdotes about other mathematicians – Girolamo Cardano is described as a “total dick” – and more a succession of scenes than a biography, the book is as much about Olsson herself as the Weils. The prose zig-zags between vignettes from the author’s own life and the Weils without warning, leaving the reader to search for connections. Facts are unsourced, and readers are left to guess what is historical and what is the author’s impressionistic character portrait. Charming and quirky, the text transforms dusty perceptions of the meetings of the secret Bourbaki society of French mathematicians into scenes of lakeside debauchery and translucent camisoles that are almost reminiscent of Pride and Prejudice. Olsson even takes us into Simone’s dreams, with the conjectures only cropping up at the end of the book. If you limit your reading to the maths and the Weils, the resulting slim volume is a page turner.

Ernst-Wilhelm Otten 1934–2019

Ernst Otten

Ernst-Wilhelm Otten received his doctorate in 1962 at the University of Heidelberg under the supervision of atomic and nuclear physicist Hans Kopfermann. From 1972 until his retirement in 2002, he headed the department of experimental atomic and nuclear physics (EXAKT) at the University of Mainz. Ernst spent numerous research stays abroad, including at CERN and at the Ecole Normale Supérieure in Paris. After his retirement Ernst continued his research activities, especially for the KATRIN neutrino experiment. The hallmark of his work was the extraordinary breadth across almost all disciplines of physics, which earned him a large number of distinctions and prizes.

In Heidelberg, Ernst developed the method of optical pumping for polarising the nuclear spins of radioactive isotopes to determine their nuclear moments. He also recognised, from its start-up in the late 1960s, the opportunities offered by the on-line isotope separator ISOLDE at CERN. He became a pioneer of optical spectroscopy with accelerators. The discovery of unexpected nuclear-shape coexistence and nuclear-size changes in neutron-deficient mercury isotopes is one of the most outstanding results obtained at ISOLDE, as early as 1972. In Mainz, his group developed the high-resolution method of collinear laser spectroscopy – now a workhorse at ISOLDE for the determination of nuclear ground-state properties of short-lived nuclei – and with his collaborators initiated laser-based trace analysis for the detection of radionuclides in the environment.

The electron accelerators at Mainz enabled spectacular experiments: the test of parity non-conservation by neutral currents in the scattering of polarised electrons off nucleons, and the determination of the neutron electric form factor using polarised 3He targets at high density. With hyperpolarised 3He gas, Ernst performed lung diagnostics by magnetic resonance imaging in collaboration with the German Cancer Research Centre in Heidelberg and the department of radiology at the University of Mainz.

In the 1980s, when a group reported a 30 eV mass of the antineutrino, Ernst developed a novel high-resolution beta spectrometer at Mainz to determine the neutrino mass very precisely from tritium decay. Together with his team he succeeded in setting an upper limit of 2 eV. After the discovery of neutrino oscillations in 1998, proving the existence of finite neutrino masses, Ernst initiated the KATRIN experiment at the Karlsruhe Institute of Technology to measure the neutrino mass. The construction of this technically extremely difficult spectrometer started in 2001, and Ernst was very actively involved until his death on 8 July. As such, he was able to witness the first successful result: the setting of a new upper limit on the neutrino mass of 1 eV (see KATRIN sets first limit on neutrino mass).

Ernst leaves deep traces in science and in the physics community. We will remember him as a great scientist, teacher, mentor and friend.

Irish politicians call for associate membership of CERN

A cross-party committee of the legislature of the Republic of Ireland has unanimously recommended joining CERN. In a report published last week, the Joint Committee on Business, Enterprise and Innovation recommended that negotiations to become an Associate Member State begin immediately. The report follows the country’s Innovation 2020 science strategy, published in 2015, which identified CERN as one of four international research bodies which Ireland would benefit from joining. Since then, Ireland has joined the other three organisations, namely the European Southern Observatory, the intergovernmental life-science collaboration ELIXIR, and the LOFAR network of radio-frequency telescopes.

Ireland is one of only three European countries that do not have any formal agreement with CERN, said committee chair Mary Butler. “Innovation 2020’s vision is for Ireland to be a global innovation leader driving a strong sustainable economy and a better society. If Ireland is to deliver on this vision, membership of organisations such as CERN, which are at the forefront of innovation, is critical.” CERN already enjoys a productive relationship with physicists in Ireland, with University College Dublin a longstanding member of the LHCb and CMS collaborations, Dublin City University working on ISOLDE, University College Cork contributing civil engineering expertise, and theorists from several institutions involved in CERN projects. In January 2016, Ireland notified CERN of its intention to initiate deliberations on potential associate membership.

“I welcome this report and endorse its recommendations, which are pragmatic and cost-effective,” says Ronan McNulty, leader of the LHCb group at University College Dublin, and witness to the committee. “Delivery of these recommendations would enormously improve Irish academic links to CERN and create a new landscape for training the next generation of scientists and engineers, as well as developing business opportunities in the technology sector, and beyond.”

Ireland has a strong particle-physics community and CERN would welcome stronger institutional links

Charlotte Warakaulle

The report envisages a “multiplier effect” for return on investment to the Irish economy as a result of joining CERN. Although around 20 Irish companies already have contracts with CERN, it notes that they are at a competitive disadvantage as the laboratory prioritises companies from member countries. Under associate membership, says the report, contracts with Irish companies could rise to one third of the country’s financial contribution to the laboratory, which for Associate Member States must be at least 10% of the cost of full membership. The cost of full membership, which yields voting rights at the CERN Council and eliminates the investment cap, depends on a country’s GDP, and is currently estimated to be of the order of €12.5 million per year in Ireland’s case.

“We note the positive report from the committee, which clearly sets out the opportunities that membership or associate membership of CERN would bring to Ireland,” said Charlotte Warakaulle, CERN’s director for international relations. “Ireland has a strong particle-physics community and CERN would welcome stronger institutional links, which we believe would be mutually beneficial.”

The Irish government will now consider the committee’s findings.

BASE tests antimatter’s dark side

A first-of-its-kind experiment at CERN has brought dark matter and antimatter face to face. The fundamental nature of dark matter, inferred to make up around a quarter of the universe, is unknown, as is the reason for the observed cosmic imbalance between matter and antimatter. Investigating potential links between the two, researchers working on the Baryon Antibaryon Symmetry Experiment (BASE) at CERN, in collaboration with members of the Helmholtz Institute at Mainz, have reported the first laboratory search for an interaction between antimatter and a dark-matter candidate: the axion.

Axions are extremely light, spinless bosons that were originally proposed in the 1970s to resolve the strong charge–parity problem of quantum chromodynamics and, later, were predicted by theories beyond the Standard Model. Being stable, axions produced during the Big Bang would still be present throughout the universe, possibly accounting for dark matter or some portion of it. In this case, Earth would experience a “wind” of gravitationally interacting dark-matter particles that would couple to matter and antimatter and periodically modulate their fundamental properties, such as their magnetic moment. However, no evidence of such an effect has so far been seen in laboratory experiments with ordinary matter, setting stringent limits on the microscopic properties of cosmic axion-like particles.

Our ALP–antiproton coupling limits are much more stringent than limits derived from astrophysical observations

Stefan Ulmer

The BASE team has now searched for the phenomenon in antimatter via measurements of the precession frequency of the antiproton’s magnetic moment, which it is able to determine with a fractional precision of 1.5×10⁻⁹. The technique relies on single-particle spin-transition spectroscopy – comparable to performing NMR with a single antiproton – whereby individual antiprotons stored in a Penning trap are spin-flipped from one state to another (CERN Courier March 2018 p25). An observed variation in the precession frequency over time could provide evidence for the nature of dark matter and, if antiprotons have a stronger coupling to these particles than protons do, such a matter–antimatter asymmetric coupling could provide a link between dark matter and the baryon asymmetry in the universe.

“We’ve interpreted these data in the framework of the axion wind model, where light axion-like particles (ALPs) oscillate through the galaxy at frequencies defined by the ALP mass,” explains lead author and BASE co-spokesperson Christian Smorra of RIKEN in Japan. “The particles couple to the spins of Standard Model particles, which would induce frequency modulations of the Larmor precession frequency.”

Accruing around 1000 measurements over a three-month period, the team determined a time-averaged frequency of the antiproton’s precession of around 80 MHz with an uncertainty of 120 mHz. No signs of regular variations were found, producing the first laboratory constraints on the existence of an interaction between antimatter and a dark-matter candidate. The BASE data constrain the axion–antiproton interaction parameter (a factor in the matrix element inversely proportional to the postulated coupling between axions and antiprotons) to be above 0.1 GeV for an axion mass of 2×10⁻²³ eV and above 0.6 GeV for an axion mass of 4×10⁻¹⁷ eV, at 95% confidence. For comparison, similar experiments using matter instead of antimatter achieve limits of above 10 and 1000 TeV for the same mass range – demonstrating that any signal at the current BASE sensitivity would imply a major violation of the established charge–parity–time (CPT) symmetry. The collaboration also derived limits on six combinations of previously unconstrained Lorentz- and CPT-violating coefficients of the non-minimal Standard Model extension.
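The essence of such a search can be sketched in a few lines: fit a sinusoidal modulation to the measured precession frequencies at each trial ALP frequency and check whether any fitted amplitude stands out from noise. The sketch below is purely illustrative – the measurement times, scatter and frequency grid are invented for the example and are not BASE data, and the collaboration’s actual analysis is considerably more sophisticated.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical data: ~1000 precession-frequency measurements over three
# months, pure white noise with ~120 mHz scatter around an 80 MHz mean.
n = 1000
t = np.sort(rng.uniform(0, 90 * 86400, n))   # measurement times [s]
f0 = 80e6                                    # mean precession frequency [Hz]
y = f0 + rng.normal(0.0, 0.12, n)            # measured frequencies [Hz]

# At each trial modulation frequency, least-squares fit A*sin + B*cos
# (a Lomb-Scargle-like scan for unevenly sampled data).
trial_f = np.logspace(-7, -4, 200)           # trial ALP frequencies [Hz]
amps = []
for f in trial_f:
    M = np.column_stack([np.sin(2 * np.pi * f * t),
                         np.cos(2 * np.pi * f * t)])
    coef, *_ = np.linalg.lstsq(M, y - y.mean(), rcond=None)
    amps.append(float(np.hypot(coef[0], coef[1])))  # fitted amplitude [Hz]

print(f"largest fitted modulation amplitude: {max(amps):.4f} Hz")
```

With pure white noise, as here, every fitted amplitude remains tiny compared to the single-measurement scatter; in a real analysis the absence of any outstanding amplitude is converted into an upper limit on the axion–antiproton coupling.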

“We have not observed any oscillatory signature, however, our ALP–antiproton coupling limits are much more stringent than limits derived from astrophysical observations,” says BASE spokesperson Stefan Ulmer of RIKEN, who is optimistic that BASE will be able to improve the sensitivity of its axion search. “Future studies, with a ten-fold improved frequency stability, longer experimental campaigns and broader spectral scans at higher frequency resolution, will allow us to increase the detection bandwidth.”

US proposal teases FCC-ee energy boost

"Green" FCC-ee with ERLs

Accelerator physicists in the US have proposed an alternative design approach for the proposed Future Circular electron–positron Collider (FCC-ee), generating lively discussions in the community on the eve of the update of the European strategy for particle physics. A 360-page conceptual design report for the 100 km FCC-ee, a possible successor to the high-luminosity LHC at CERN, was published in January following a five-year study by the international FCC collaboration. A key consideration of the baseline design was to minimise energy consumption – a challenge addressed by the novel US proposal based on technology recently explored for future electron–ion and electron–proton colliders.

The modified acceleration scheme, laid out in a preprint published recently by Vladimir Litvinenko (Stony Brook) and Thomas Roser and Maria Chamizo-Llatas (Brookhaven National Laboratory), uses Energy Recovery Linacs (ERLs), which the authors argue would reduce synchrotron-radiation losses by a factor of ten compared with the FCC-ee baseline design. “In addition to the potential power saving, the ERL version of the FCC-ee could extend the centre-of-mass energy reach up to 600 GeV while providing very high luminosities,” says Chamizo-Llatas. The maximum energy discussed in the conceptual design report for the FCC-ee baseline is 365 GeV, as required for top-antitop production.

First proposed by Maury Tigner in 1965, ERLs recoup the kinetic energy of particle bunches by manipulating their arrival time in the radio-frequency (RF) cavities. Previously accelerated bunches encounter a decelerating electric field, and the regained energy, stored once again in the cavity’s field, may be recycled to accelerate subsequent bunches. Though an old idea, ERLs are only now becoming feasible due to the high quality of modern superconducting RF cavities.
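The energy-recovery principle described above can be sketched in a few lines: a bunch crossing an RF cavity changes its energy by qV cos φ, so a fresh bunch timed to the accelerating crest gains energy, while a spent bunch arriving half an RF period later sees the opposite field and hands its energy back to the cavity. The voltage and charge below are illustrative values, not parameters of any real machine.

```python
import math

# Toy model of RF energy recovery: the energy change of a bunch crossing
# a cavity is q * V * cos(phase). A bunch at phase 0 is accelerated; a
# previously accelerated bunch timed at phase pi is decelerated, returning
# its energy to the cavity field for reuse.
V_CAVITY = 20e6   # illustrative cavity voltage [V]
Q = 1.0           # bunch charge in units of e (illustrative)

def energy_change_ev(phase_rad: float) -> float:
    """Energy change (eV) of a bunch crossing the cavity at a given RF phase."""
    return Q * V_CAVITY * math.cos(phase_rad)

gain = energy_change_ev(0.0)        # fresh bunch on the accelerating crest
loss = energy_change_ev(math.pi)    # spent bunch on the decelerating crest
net_cavity_drain = gain + loss      # ideally zero: the energy is recycled
print(gain, loss, net_cavity_drain)
```

In an ideal cavity the net energy drawn per accelerated-plus-decelerated bunch pair is zero; in practice, losses in the cavity walls and imperfect timing set the achievable recovery efficiency.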

In June the Cornell–Brookhaven ERL Test Accelerator (CBETA) facility, which was envisaged as an ERL demonstrator for the Electron-Ion Collider (EIC) proposed in the US, achieved full energy recovery for a single pass. Prior to this, the concept was demonstrated at Jefferson Laboratory in the US and at Daresbury Laboratory in the UK. Further R&D with cavity technology compatible with the FCC-ee proposal is planned for the Powerful Energy-Recovery Linac for Experiments (PERLE) project at Orsay, which was conceived as a test facility for electron-proton colliders.

The basic feasibility of the proposed concept must still be demonstrated

Frank Zimmermann

The US trio’s alternative FCC-ee proposal, which was inspired by past design work for the EIC, maintains high beam quality by decelerating the beams after every collision at one of the interaction points, and “cooling” them in dedicated rings. The use of ERLs allows the beams to be decelerated, cooled and re-accelerated with minimal energy expended, potentially yielding much lower emittances than found in conventional circular machines. “The electric power consumption of a future FCC-ee will be a limiting factor for luminosity and centre-of-mass energy,” says Roser. “During our design studies for the EIC we realised that using an ERL for the electrons could produce significantly more luminosity for a given electron beam current,” he explains, though the team acknowledges that their concept would require studies as extensive as those the FCC-ee design team carried out for the storage-ring design.

The BNL proposal is certainly tantalising, agrees FCC deputy study leader Frank Zimmermann of CERN. “Presently, Energy Recovery Linacs are a topic of great worldwide interest, with efforts ongoing, for example, at Cornell, Jefferson Lab, KEK, Mainz, and Orsay,” he says. “However, the basic feasibility of the proposed concept must still be demonstrated and the potentially high investment cost understood, before this approach could be considered as a highest-energy option for a future circular lepton collider.”

Gauge–gravity duality opens new horizons

What, in a nutshell, did you uncover in your famous 1997 work, which became the most cited in high-energy physics?

Juan Maldacena

The paper conjectured a relation between certain quantum field theories and gravity theories. The idea was that a strongly coupled quantum system can generate complex quantum states that have an equivalent description in terms of a gravity theory (or a string theory) in a higher dimensional space. The paper considered special theories that have lots of symmetries, including scale invariance, conformal invariance and supersymmetry, and the fact that those symmetries were present on both sides of the relationship was one of the pieces of evidence for the conjecture. The main argument relating the two descriptions involved objects that appear in string theory called D-branes, which are a type of soliton. Polchinski had previously given a very precise description of the dynamics of D-branes. At low energies a soliton can be described by its centre-of-mass position: if you have N solitons you will have N positions. With D-branes it is the same, except that when they coincide there is a non-Abelian SU(N) gauge symmetry that relates these positions. So this low-energy theory resembles the theory of quantum chromodynamics, except with N colours and special matter content.

On the other hand, these D-brane solitons also have a gravitational description, found earlier by Horowitz and Strominger, in which they look like “black branes” – objects similar to black holes but extended along certain spatial directions. The conjecture was simply that these two descriptions should be equivalent. The gravitational description becomes simple when N and the effective coupling are very large.
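In its best-studied form, the conjecture described here equates four-dimensional maximally supersymmetric Yang–Mills theory with string theory on a higher-dimensional curved space; the standard statement (a textbook summary, not taken from the interview) is:

```latex
\mathcal{N}=4\ \mathrm{SU}(N)\ \text{super Yang--Mills in 4D}
\;\;\Longleftrightarrow\;\;
\text{type IIB string theory on } AdS_5 \times S^5,
\qquad
\frac{R^4}{\alpha'^2} \;=\; g_{\mathrm{YM}}^2 N \;\equiv\; \lambda,
\qquad
g_s \;=\; \frac{g_{\mathrm{YM}}^2}{4\pi}.
```

Here R is the curvature radius of the AdS₅ and S⁵ factors, α′ sets the string length and λ is the ’t Hooft coupling: when N and λ are large, R is large in string units and the gravity description is weakly curved and simple, as the interviewee notes.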

Did you stumble across the duality, or had you set out to find it?

It was based on previous work on the connection between D-branes and black holes. The first major result in this direction was the computation of Strominger and Vafa, who considered an extremal black hole and compared it to a collection of D-branes. By computing the number of states into which these D-branes can be arranged, they found that it matched the Bekenstein–Hawking black-hole entropy given in terms of the area of the horizon. Such black holes have zero temperature. By slightly exciting these black holes, some of us attempted to extend such results to non-zero temperatures, which allowed us to probe the dynamics of those nearly extremal black holes. Some computations gave similar answers, sometimes exactly, sometimes up to coefficients. It was clear that there was a deep relation between the two, but it was unclear what the concrete relation was. The gravity–gauge (AdS/CFT) conjecture clarified the relationship.

Are you surprised by its lasting impact?

Yes. At the time I thought that it was going to be interesting for people thinking about quantum gravity and black holes. But the applications that people found to other areas of physics continue to surprise me. It is important for understanding quantum aspects of black holes. It was also useful for understanding very strongly coupled quantum theories. Most of our intuition for quantum field theory is for weakly coupled theories, but interesting new phenomena can arise at strong coupling. These examples of strongly coupled theories can be viewed as useful calculable toy models. The art lies in extracting the right lessons from them. Some of the lessons include possible bounds on transport, a bound on chaos, etc. These applications involved a great deal of ingenuity since one has to extract the right lessons from the examples we have in order to apply them to real-world systems.

What does the gravity–gauge duality tell us about nature, given that it relates two pictures (e.g. involving different dimensionalities of space) that have not yet been shown to correspond to the physical world?

It suggests that the quantum description of spacetime can be in terms of degrees of freedom that are not localised in space. It also says that black holes are consistent with quantum mechanics, when we look at them from the outside. More recently, it was understood that when we try to describe the black-hole interior, then we find surprises. What we encounter in the interior of a black hole seems to depend on what the black hole is entangled with. At first this looks inconsistent with quantum mechanics, since we cannot influence a system through entanglement. But it is not. Standard quantum mechanics applies to the black hole as seen from the outside. But to explore the interior you have to jump in, and you cannot tell the outside observer what you encountered inside.

One of the most interesting recent lessons is the important role that entanglement plays in constructing the geometry of spacetime. This is particularly important for the black-hole interior.

I suspect that with the advent of quantum computers, it will become increasingly possible to simulate these complex quantum systems that have some features similar to gravity. This will likely lead to more surprises.

In what sense does AdS/CFT allow us to discuss the interior of a black hole?

It gives us directly a view of a black hole from the outside, more precisely a view of the black hole from very far away. In principle, from this description we should be able to understand what goes on in the interior. While there has been some progress on understanding some aspects of the interior, a full understanding is still lacking. It is important to understand that there are lots of weird possibilities for black-hole interiors. Those we get from gravitational collapse are relatively simple, but there are solutions, such as the full two-sided Schwarzschild solution, where the interior is shared between two black holes that are very far away. The full Schwarzschild solution can therefore be viewed as two entangled black holes in a particular state called the thermofield double, a suggestion made by Werner Israel in the 1970s. The idea is that by entangling two black holes we can create a geometric connection through their interiors: the black holes can be very far away, but the distance through the interior could be very short. However, the geometry is time-dependent and signals cannot go from one side to the other. The geometry inside is like a collapsing wormhole that closes off before a signal can go through. In fact, this is a necessary condition for the interpretation of these geometries as entangled states, since we cannot send signals using entanglement. Susskind and myself have emphasised this connection via the “ER=EPR” slogan. This says that EPR correlations (or entanglement) should generally give rise to some sort of “geometric” connection, or Einstein–Rosen bridge, between the two systems. The Einstein–Rosen bridge is the geometric connection between two black holes present in the full Schwarzschild solution.
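The thermofield double state mentioned here has a standard expression: it is the particular entangled state of the two far-separated systems, built from their energy eigenstates |n⟩ at inverse temperature β,

```latex
|\mathrm{TFD}\rangle \;=\; \frac{1}{\sqrt{Z(\beta)}} \sum_n e^{-\beta E_n/2}\, |n\rangle_L \otimes |n\rangle_R ,
```

where Z(β) is the thermal partition function. Tracing out either side leaves an exactly thermal density matrix, which is why each black hole on its own looks like an ordinary thermal black hole even though the pair is connected through the interior.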

Are there potential implications of this relationship for intergalactic travel?

Gao, Jafferis and Wall have shown that an interesting new feature appears when one brings two entangled black holes close to each other. Now there can be a direct interaction between the two black holes and the thermofield double state can be close to the ground state of the combined system. In this case, the geometry changes and the wormhole becomes traversable.

One can find solutions of the Standard Model plus gravity that look like two microscopic magnetically charged black holes joined by a wormhole

In fact, as shown by Milekhin, Popov and myself, one can find solutions of the Standard Model plus gravity that look like two microscopic magnetically charged black holes joined by a wormhole. We could construct a controllable solution only for small black holes because we needed to approximate the fermions as being massless.

If one wanted a big macroscopic wormhole where a human could travel, then it would be possible with suitable assumptions about the dark sector. We’d need a dark U(1) gauge field and a very large number of massless fermions charged under U(1). In that case, a pair of magnetically charged black holes would enable one to travel between distant places. There is one catch: the time it would take to travel, as seen by somebody who stays outside the system, would be longer than the time it takes light to go between the two mouths of the wormhole. This is good, since we expect that causality should be respected. On the other hand, due to the large warping of the spacetime in the wormhole, the time the traveller experiences could be much shorter. So it seems similar to what would be experienced by an observer that accelerates to a very high velocity and then decelerates. Here, however, the force of gravity within the wormhole is doing the acceleration and deceleration. So, in theory, you can travel with no energy cost.

How does AdS/CFT relate to broader ideas in quantum information theory and holography?

Quantum information has been playing an important role in understanding how holography (or AdS/CFT) works. One important development is a formula, due to Ryu and Takayanagi, for the fine-grained entropy of gravitational systems, such as a black hole. It is well known that the area of the horizon gives the coarse-grained, or thermodynamic, entropy of a black hole. The fine-grained entropy, by contrast, is the actual entropy of the full quantum density matrix describing the system. Surprisingly, this entropy can also be computed in terms of the area of the surface. But it is not the horizon, it is typically a surface that lies in the interior and has a minimal area. 
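The Ryu–Takayanagi prescription referred to here can be stated compactly (in its original static form, with ħ = c = 1): the fine-grained entropy of a boundary region A is computed by minimising the area of bulk surfaces anchored on A,

```latex
S(A) \;=\; \min_{\gamma_A} \, \frac{\mathrm{Area}(\gamma_A)}{4 G_N},
```

where the minimum is taken over bulk surfaces γ_A homologous to A. When A is the whole boundary of a black-hole spacetime, the minimal surface generally lies inside or at the horizon, reproducing the point made above that the fine-grained entropy is a minimal-area surface rather than the horizon itself.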

If you could pick any experiment to be funded and built, what would it be?

Well, I would build a higher energy collider, of say 100 TeV, to understand better the nature of the Higgs potential and look for hints of new physics. As for smaller scale experiments, I am excited about the current prospects to manipulate quantum matter and create highly entangled states that would have some of the properties that black holes are supposed to have, such as being maximally chaotic and allowing the kind of traversable wormholes described earlier.

How close are we to a unified theory of nature’s interactions?

String theory gives us a framework that can describe all the known interactions. It does not give a unique prediction, and the accommodation of a small cosmological constant is possible thanks to the large number of configurations that the internal dimensions can acquire. This whole framework is based on Kaluza–Klein compactifications of 10D string theories. It is possible that a deeper understanding of quantum gravity for cosmological solutions will give rise to a probability measure on this large set of solutions that will allow us to make more concrete predictions.
