For every trillion K0S, only five are expected to decay to two muons. Like the better-known Bs → μ+μ– decay, which was first observed jointly by LHCb and CMS in 2013, the K0S → μ+μ– decay rate is very sensitive to possible contributions from yet-to-be-discovered particles that are too heavy to be observed directly at the LHC, such as leptoquarks or supersymmetric partners. These particles could significantly enhance the decay rate, up to existing experimental limits, but could also suppress it via quantum interference with the Standard Model (SM) amplitude.
Despite the unprecedented K0S production rate at the LHC, searching for K0S → μ+μ– is challenging due to the low transverse momentum of the two muons, typically a few hundred MeV/c. Though the experiment was primarily designed for the study of heavy-flavour particles, LHCb’s unique ability to select low transverse-momentum muons in real time makes the search feasible. According to SM predictions, just two signal events are expected in the Run-2 data, potentially making this the rarest decay ever recorded.
The analysis uses two machine-learning tools: one to discriminate muons from pions, and another to discriminate signal candidates from the so-called combinatorial background that arises from random combinations of unrelated particles. Additionally, a detailed and data-driven map of the detector material around the interaction point helps to reduce the “fixed-target” background caused by particles interacting with the detector material. A background of K0S → π+π– decays dominates the selection, and in the absence of a compelling signal, an upper limit on the branching fraction of 2.1 × 10–10 has been set at 90% confidence level. This is approximately four times more stringent than the previous world-best limit, set by LHCb with Run-1 data. This result has implications for physics models with leptoquarks and some fine-tuned regions of the Minimal Supersymmetric SM.
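To illustrate how such a limit arises, the sketch below computes the classical upper limit on a Poisson mean for a background-free counting experiment. This is a minimal toy calculation, not the actual LHCb procedure, which accounts for backgrounds, efficiencies and systematic uncertainties:

```python
import math

def poisson_cdf(n, mu):
    """P(N <= n) for a Poisson distribution with mean mu."""
    return sum(math.exp(-mu) * mu**k / math.factorial(k) for k in range(n + 1))

def upper_limit(n_obs, cl=0.90, tol=1e-9):
    """Classical upper limit on the Poisson mean: the smallest mu for which
    P(N <= n_obs | mu) <= 1 - cl, found by bisection."""
    lo, hi = 0.0, 100.0
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if poisson_cdf(n_obs, mid) > 1 - cl:
            lo = mid  # mu too small: observing n_obs or fewer is still likely
        else:
            hi = mid
    return hi

# With zero observed events, the 90% CL upper limit on the mean
# is ln(10) ~ 2.30 events; dividing by the number of produced kaons
# times the selection efficiency would convert this into a limit
# on the branching fraction.
print(upper_limit(0))
```

The limit on the branching fraction then scales inversely with the effective number of analysed decays, which is why the much larger Run-2 sample tightens the Run-1 result.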
The upgraded LHCb detector, scheduled to begin operating in 2021 after the present long shutdown of the LHC, will offer excellent opportunities to improve the precision of this search and eventually find a signal. In addition to the increased luminosity, the LHCb upgrade will have a full software trigger, which is expected to significantly improve the signal efficiency for K0S → μ+μ– and other decays with very soft final-state particles.
The 3 km-high summit of Cerro Armazones, located in the Atacama desert of northern Chile, is the construction site for one of the most ambitious projects ever mounted by astronomers: the Extremely Large Telescope (ELT). Scheduled for first light in 2025, the ELT is centred on a 39 m-diameter main mirror that will gather 250 times more light than the Hubble Space Telescope and use advanced corrective optics to obtain exceptional image quality. It is the latest major facility of the European Southern Observatory (ESO), which has been surveying the southern skies for almost 60 years.
The science goals of the ELT are vast and diverse. Its sheer size will enable the observation of distant objects that are currently beyond reach, allowing astronomers to better understand the formation of the first stars, galaxies and even black holes. The sharpness of its images will also enable a deeper study of extrasolar planets, possibly even the characterisation of their atmospheres. “One new direction may become possible through very high precision spectroscopy – direct detection of the expansion rate of the universe, which would be an amazing feat,” explains Pat Roche of the University of Oxford and former president of the ESO council. “But almost certainly the most exciting results will be from unexpected discoveries.”
Technical challenges
The ELT was approved in 2006, and civil engineering began in 2014. Construction of the 74 m-high, 86 m-diameter dome and the 3400-tonne main structure began in 2019, while in January 2018 the first segments of the main mirror were successfully cast, marking the first step of a challenging five-mirror system that goes beyond the traditional two-mirror “Gregorian” design. The introduction of a third powered mirror delivers a focal plane that remains un-aberrated at all field locations, while a fourth and a fifth mirror correct, in real time, distortions due to the Earth’s atmosphere and other external factors. This novel arrangement, combined with the sheer size of the ELT, makes almost every aspect of the design particularly challenging.
The main mirror is itself a monumental enterprise: it consists of 798 hexagonal segments, each measuring approximately 1.4 m across and 50 mm thick. To keep the surface unchanged by external factors such as temperature or wind, each segment has edge sensors that measure its location to within a few nanometres – the most accurate ever used in a telescope. The construction and polishing of the segments, as well as of the edge sensors, is a demanding task, possible only thanks to collaboration with industry; at least seven private companies are working on the main mirror alone. The mirror was originally designed to be 42 m in diameter, but was later reduced to 39 m, mainly for cost reasons, while still allowing the ELT to fulfil its main scientific goals. “The ELT is ESO’s largest project and we have to ensure that it can be constructed and operated within the available budget,” says Roche. “A great deal of careful planning and design, most of it with input from industry, was undertaken to understand the costs and the cost drivers, and the choice of primary mirror diameter emerged from these analyses.”
The task is not much easier for the other mirrors. The secondary mirror, measuring 4 m across, is highly convex and will be the largest secondary mirror ever employed on a telescope, as well as the largest convex mirror ever produced. The ELT’s tertiary mirror also has a curved surface, contrary to more traditional designs. The fourth mirror will be the largest adaptive mirror ever made, supported by more than 5000 actuators that will deform and adjust its shape in real time to achieve a factor-500 improvement in resolution.
Currently 28 companies are actively collaborating on different parts of the ELT design; most of these companies are European, but the contracts also include the Chilean companies ICAFAL, for the road and platform construction, and Abengoa, for the ELT technical facility. Among the European contracts, the construction of the telescope dome and main structure by the Italian ACe consortium of Astaldi and Cimolai is the largest in ESO’s history. The total cost estimate for the baseline design of the ELT is €1.174 billion, while the running cost is estimated to be around €50 million per year. Since the approval of the ELT, ESO has increased its number of member states from 14 to 16, with Poland and Ireland joining in 2015 and 2018, respectively. Chile is a host state and Australia a strategic partner.
European Southern Observatory’s particle-physics roots
The ELT’s success lies in ESO’s vast experience in the construction of innovative telescopes. The idea for ESO, a 16-nation intergovernmental organisation for research in ground-based astronomy, was conceived in 1954 with the aim of creating a European observatory dedicated to observations of the southern sky. At the time, the largest such facilities had an aperture of about 2 m; more than 50 years later, ESO is responsible for a variety of observatories, including its first telescope at La Silla, not far from Cerro Armazones (home of the ELT).
Like CERN, ESO was born in the aftermath of the Second World War to allow European countries to develop scientific projects that individual nations were unable to pursue on their own. The similarities are by no means a mere coincidence. From the beginning, CERN served as a model for important administrative aspects of the organisation, such as the council delegate structure, the finance base and personnel regulations. A stronger collaboration ensued in 1969, when ESO approached CERN to assist with the powerful and sophisticated instrumentation of its 3.6 m telescope and with other challenges ESO was facing, both administrative and technological. This collaboration saw ESO facilities established at CERN: the Telescope Project Division and, a few years later, ESO’s Sky Atlas Laboratory. A similar collaboration has since been organised with EMBL and, more recently, with a new hadron-therapy facility in Southeast Europe.
Unprecedented view
A telescope of this scale has never been attempted before in astronomy. Not only must the ELT be constructed and operated within the available budget, but it should not impact the operation of ESO’s current flagship facilities (such as the VLT, the VLT interferometer and the ALMA observatory).
The amount of data produced by the ELT is estimated to be around 1–2 TB per night, including scientific observations plus calibration observations. The data will be analysed automatically, and users have the option to download the processed data or, if needed, download the original data and process it in their own research centres. To secure observation time with the facility, ESO makes a call for proposals once or twice a year, in which researchers propose observations in their fields of interest. “A committee of astronomers then evaluates the proposals and ranks them according to their relevance and potential scientific impact; the highest-ranked ones are then chosen to be followed,” explains project scientist Miguel Pereira of the University of Oxford.
Currently, 28 companies, mostly from Europe, are actively collaborating on different parts of the ELT design
In addition to its astronomical goals, the ELT will contribute to the growing confluence of cosmology and fundamental physics. Specifically, it will help elucidate the nature of dark energy by identifying distant type Ia supernovae, which serve as excellent markers of the universe’s expansion history. The ELT will also measure the change in redshift with time of distant objects – a feat that is beyond the capabilities of current telescopes – to indicate the rate of expansion. Possible variations over time of fundamental constants, such as the fine-structure constant and the strong coupling constant, will also be targeted. Such measurements are very challenging because the strength of the constraint on the variability depends critically on the accuracy of the wavelength calibration. The ELT’s ultra-stable high-resolution spectrograph aims to remove the systematic uncertainty currently present in wavelength-calibration measurements, offering the possibility of an unambiguous detection of such variations.
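The size of the redshift-drift signal can be sketched with the standard Sandage–Loeb formula, dz/dt = (1 + z)H0 − H(z). The cosmological parameters below are illustrative assumptions for a flat ΛCDM universe, not values adopted by ESO, but they show why the measurement demands extreme spectrograph stability: the accumulated shift over decades is a fraction of a centimetre per second.

```python
import math

# Illustrative flat-LambdaCDM parameters (assumptions, not ELT values)
H0_KM_S_MPC = 70.0           # Hubble constant [km/s/Mpc]
OMEGA_M, OMEGA_L = 0.3, 0.7  # matter and dark-energy density fractions
MPC_KM = 3.0857e19           # kilometres per megaparsec
SEC_PER_YR = 3.1557e7
C_CM_S = 2.9979e10           # speed of light [cm/s]

def hubble(z):
    """H(z) in s^-1 for a flat LambdaCDM cosmology."""
    h0 = H0_KM_S_MPC / MPC_KM
    return h0 * math.sqrt(OMEGA_M * (1 + z)**3 + OMEGA_L)

def redshift_drift(z, years):
    """Sandage-Loeb drift dz/dt = (1+z)H0 - H(z), accumulated over 'years'."""
    h0 = H0_KM_S_MPC / MPC_KM
    return ((1 + z) * h0 - hubble(z)) * years * SEC_PER_YR

def velocity_drift_cm_s(z, years):
    """Equivalent spectroscopic velocity shift dv = c dz / (1+z), in cm/s."""
    return C_CM_S * redshift_drift(z, years) / (1 + z)

# The drift is positive at moderate redshift and changes sign at high z,
# where matter domination makes H(z) grow faster than (1+z)H0.
for z in (1, 2, 4):
    print(z, velocity_drift_cm_s(z, 20))
```

Over 20 years a source at z = 2 drifts by well under 1 cm/s in these units, which is the regime the ELT’s high-resolution spectrograph is designed to reach.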
The ELT construction is on schedule for completion, and first light is expected in 2025. “In the end, projects succeed because of the people who design, build and support them,” Roche says, attributing the success of the ELT to rigorous attention to design and analysis across all aspects of the project. The road ahead is still challenging and full of obstacles, but, as the former director of the Paris observatory André Danjon wrote to his counterpart at the Leiden Observatory, Jan Oort, in 1962: “L’astronomie est bien l’école de la patience” (“astronomy is indeed the school of patience”). No doubt the ELT will yield extraordinary scientific rewards.
The first joint meeting of the European Committee for Future Accelerators (ECFA), the Nuclear Physics European Collaboration Committee (NuPECC) and the Astroparticle Physics European Consortium (APPEC) took place on 14–16 October in Orsay, France. Making progress in domains such as dark matter, neutrinos and gravitational waves increasingly requires interdisciplinary approaches to scientific and technological challenges, and the new Joint ECFA-NuPECC-APPEC Seminar (JENAS) events are designed to reinforce links between astroparticle, nuclear and particle physicists.
Jointly organised by LAL-Orsay, IPN-Orsay, CSNSM-Orsay, IRFU-Saclay and LPNHE-Paris, the inaugural JENAS meeting saw 230 junior and senior members of the three communities discuss overlapping interests. Readout electronics, silicon photomultipliers, big-data computing and artificial intelligence were just a handful of the topics discussed. For example, the technological evolution of silicon photomultipliers, which are capable of measuring single-photon light signals and can operate at low voltage and in magnetic fields, will be key both for novel calorimeters and timing detectors at the high-luminosity LHC. They will also be used in the Cherenkov Telescope Array – an observatory of more than 100 telescopes which will be installed at La Palma in the northern hemisphere, and in the Atacama Desert in the southern hemisphere, becoming the world’s most powerful instrument for ground-based gamma-ray astronomy.
As chairs of the three consortia, we issued a call for novel expressions of interest
Organisational synergies related to education, outreach, open science, open software and careers are also readily identified, and a diversity charter was launched by the three consortia, whereby statistics on relevant parameters will be collected at each conference and workshop in the three subfields. This will allow the communities to verify how well they embrace diversity.
As chairs of the three consortia, we issued a call for novel expressions of interest to tackle common challenges in subjects as diverse as computing and the search for dark matter. Members of the high-energy physics and related communities can submit their ideas, in particular those concerning synergies in technology, physics, organisation and applications. APPEC, ECFA and NuPECC will discuss and propose actions in advance of the next JENAS event in 2021.
Gamma-ray bursts (GRBs) are the brightest electromagnetic events in the universe since the Big Bang. First detected in 1967, GRBs have since been observed about once per day using a range of instruments, allowing astrophysicists to gain a deeper understanding of their origin. As often happens, three GRBs were detected on 14 January 2019. While the first two were not of particular interest, the unprecedented energy of photons emitted by the third – measured by the MAGIC telescopes – provides new insight into these mysterious phenomena.
The study of GRBs is unique, both because GRBs occur at random locations and times and because each GRB has different time characteristics and energy spectra. GRBs consist of two phases: a prompt phase, lasting from hundreds of milliseconds to hundreds of seconds, which consists of one or several bright bursts of hard X-rays and gamma-rays, followed by a significantly weaker “afterglow” phase, which can be observed at lower energies ranging from radio to X-rays and can last for up to months.
The recent detection adds yet another messenger: TeV photons
Since the late 1990s, optical observations have confirmed both that GRBs happen in other galaxies and that longer-duration GRBs tend to be associated with supernovae, strongly hinting that they result from the death of massive stars. Shorter GRBs, meanwhile, have recently been shown to result from neutron-star mergers, thanks to the first joint observation of a GRB with a gravitational-wave event in 2017. While that event is often regarded as the start of multi-messenger astrophysics, the recent detection of GRB 190114C, lying 4.5 billion light-years from Earth, adds yet another messenger to the field of GRB astrophysics: TeV photons.
The MAGIC telescopes on the island of La Palma measure the Cherenkov radiation produced when TeV photons induce electromagnetic showers after interacting with the Earth’s atmosphere. During the past 15 years, MAGIC has discovered a range of astrophysical sources via their emission at these extreme energies. Detecting such emission from GRBs, however, remained elusive despite more than 100 attempts and theoretical predictions that it could exist.
On 14 January, based on an alert provided by space-based gamma-ray detectors, the MAGIC telescopes started repointing within a few tens of seconds of the onset of the GRB. Within the next half hour, the telescopes had observed around 1000 high-energy photons from the source. This emission, long predicted by theorists, is shown by the collaboration to be the result of the “synchrotron self-Compton” process: high-energy electrons accelerated in the initial violent explosion interact with magnetic fields generated where the ejected material collides with interstellar matter. The synchrotron emission from this interaction produces the afterglow observed at X-ray, optical and radio energies, but some of the synchrotron photons subsequently undergo inverse Compton scattering off the same electrons, boosting them to TeV energies. The MAGIC measurements show for the first time that this mechanism does indeed occur; given the many past observations in which it was not seen, it appears to be yet another feature that differs between GRBs.
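The energy boost from inverse Compton scattering can be sketched with the textbook Thomson-regime estimate, in which an electron with Lorentz factor γ upscatters a photon of energy E to a mean energy of roughly (4/3)γ²E. The electron and photon energies below are illustrative assumptions chosen to keep the estimate in the Thomson regime, not values from the MAGIC analysis (which must also treat Klein–Nishina corrections):

```python
# Order-of-magnitude sketch of inverse Compton boosting in the
# Thomson regime: E_out ~ (4/3) * gamma^2 * E_in.
# The numbers are illustrative assumptions, not MAGIC measurements.

M_E_C2_EV = 5.11e5  # electron rest energy [eV]

def inverse_compton_energy_ev(gamma, photon_energy_ev):
    """Mean energy of a photon after inverse Compton scattering
    off an electron with Lorentz factor gamma (Thomson regime,
    valid while gamma * E_in stays below the electron rest energy)."""
    return (4.0 / 3.0) * gamma**2 * photon_energy_ev

# An electron with gamma ~ 2e6 upscattering a ~0.2 eV (infrared)
# synchrotron photon reaches the TeV range.
gamma, e_in = 2e6, 0.1875
assert gamma * e_in < M_E_C2_EV  # Thomson-regime sanity check
print(inverse_compton_energy_ev(gamma, e_in) / 1e12, "TeV")
```

The same electron population thus naturally links the low-energy afterglow to the TeV component, which is the essence of the synchrotron self-Compton picture.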
The MAGIC results were published in an issue of Nature that also reported the discovery of similar emission from a different GRB by another Cherenkov telescope: the High Energy Stereoscopic System (H.E.S.S.) in Namibia. While the measurements are consistent, it is interesting to note that the H.E.S.S. measurements were made ten hours after that particular GRB, showing that this type of emission can also occur on much later timescales. With two new large-scale Cherenkov observatories – the Large High Altitude Air Shower Observatory in China and the global Cherenkov Telescope Array – about to commence data taking, the field of GRB astrophysics can now expect a range of new discoveries.
In 1871, James Clerk Maxwell undertook the titanic enterprise of planning a new physics laboratory for the University of Cambridge from scratch. To avoid mistakes, he visited the Clarendon Laboratory in Oxford, and the laboratory of William Thomson (Lord Kelvin) in Glasgow – then the best research institutes in the country – to learn all that he could from their experiences. Almost 150 years later, Malcolm Longair, a renowned astrophysicist and the Cavendish Laboratory’s head from 1997 to 2005, has written a monumental account of the scientific achievements of those who researched, worked and taught at a laboratory that has become an indispensable part of the machinery of modern science.
The 22 chapters of the book are organised in ten parts corresponding to the inspiring figures who led the laboratory through the years, most famous among them Maxwell himself, Thomson, Rutherford, Bragg, Mott and a few others. The numerous Nobel laureates who spent part of their careers at the Cavendish are also nicely characterised, among them Chadwick, Appleton, Kapitsa, Cockcroft and Walton, Blackett, Watson and Crick, Cormack and, last but not least, Didier Queloz, who won the Nobel prize in 2019 and is a professor at the universities of Cambridge and Geneva. You may even read about friends and collaborators, as the exposition includes the most recent achievements of the laboratory.
Rutherford and Thomson managed the finances of the laboratory almost from their personal cheque book
Besides the accuracy of the scientific descriptions and the sharpness of the ideas, this book inaugurates a useful compromise that might inspire future science historians. Until now it has been customary to write biographies (or collected works) of leading scientists, and extensive histories of various laboratories; here these two complementary aspects are happily married in a way that may lead to further insights into the genesis of crucial discoveries. Longair elucidates the physics with a competent care that is often difficult to find. His exciting accounts will stimulate an avalanche of thoughts on the development of modern science. By returning to a time when Rutherford and Thomson managed the finances of the laboratory almost from their personal cheque books, this book will stimulate readers to reflect on the interplay between science, management and technology.
History is often instrumental in understanding where we come from, but it cannot reliably predict directions for the future. Nevertheless the history of the Cavendish shows that lasting progress can come from diversity of opinion, the inclusiveness of practices and mutual respect between fundamental sciences. How can we sum up the secret of the scientific successes described in this book? A tentative recipe might be unity in necessary things, freedom in doubtful ones and respect for every honest scientific endeavour.
ATLAS – A 25 Year Insider Story of the LHC Experiment is a comprehensive overview of one of the most complex and successful scientific endeavours ever undertaken. A total of 117 authors collaborated to write on diverse aspects of the ATLAS project, ranging from the early days of the proto-collaboration and the test-beam studies to verify detector concepts, through the design, building and installation of the detector systems, the construction of the event-selection and computing environments, and the formation of the organisation, to a summary of the physics harvested thus far. Some of the chapters cover topics that are discussed elsewhere – the description of the detector summarises more extensive journal publications, the major physics achievements have been covered in recent review articles and the organisational structure is discussed on the web – but this volume usefully brings these various aspects together in a single place with a unified treatment.
Despite the many authors who contributed to this book, the style and level of treatment are reasonably coherent. There are many figures and pictures that augment the text. Those showing detector elements that are now buried out of sight are important complements to the text descriptions; the pictures of circuit boards are less helpful, beyond demonstrating that the electronics exist. A most engaging feature is the inclusion of one-page “stories” at the ends of the chapters, each giving some insight into the ups and downs of how the enterprise works. Among these vignettes are such stories as the ATLAS collaboration week that almost no one attended and the spirit of camaraderie among the experimenters and accelerator operators at the daily 08:30 run meetings.
One could imagine several audiences for this book, and I suspect that, apart from ATLAS collaborators themselves, each reader will find different chapters most suited to their interests. The 26-page chapter “The ATLAS Detector Today” offers a more accessible overview for students just joining the collaboration than the 300-page journal publication referenced in most ATLAS publications. Similarly, “Towards the High-Luminosity LHC” gives a helpful brief introduction to the planned upgrades. “Building up the Collaboration” will be useful to historians of science seeking to understand how scientists, institutions and funding agencies engage in a project whose time is ripe. Those interested in project management will find “Detector Construction Around the World” illuminating: this chapter shows how the design and fabrication of detector subsystems is organised with several, often geographically disparate, institutions joining together, each contributing according to its unique talents. “From the LoI to the Detector Construction” and “Installation of the Detectors and Technical Coordination” will appeal to engineers and technical managers. The chapters “Towards the ATLAS Letter of Intent” and “From Test Beams to First Physics” catalogue the steps that were necessary to realise the collaboration and experiment, but whose details are primarily interesting to those who lived through those epochs. Finally, “Highlights of Physics Results (2010–2018)” could have offered an exciting story for non-scientists, and indeed the thrill of the chase for the Higgs boson comes through vividly, but with unexplained mentions of leptons, loops and quantum corrections, the treatment is at a more technical level than would be ideal for such readers, and the plots plucked from publications are not best suited to convey what was learned to non-physicists.
What makes a collaboration like this tick?
Given the words in the foreword that the book is “intended to provide an insider story covering all aspects of this global science project,” I looked forward to the final chapter, “ATLAS Collaboration: Life and its Place in Society”, to get a sense of the human dimensions of the collaboration. While some of that discussion is quite interesting – the collaboration’s demographics and the various outreach activities undertaken to engage the public – there is a missing element that I would have appreciated: what makes a collaboration like this tick? How did the large personalities involved manage to come to common decisions and compromises on the detector designs? How do physicists from nations and cultures that are at odds with each other on the world stage manage to work together constructively? How does one account for the distinct personalities that each large scientific collaboration acquires? Why does every eligible author sign all ATLAS papers, rather than just those who did the reported analysis? How does the politics for choosing the collaboration management work? Were there design choices that came to be regretted in the light of subsequent experience? In addition to the numerous successes, were there failures? Although I recognise that discussing these more intimate details runs counter to the spirit of such large collaborations, in which one seeks to damp out as much internal conflict as possible, examining some of them would have made for a more compelling book for the non-specialist.
The authors should be commended for writing a book unlike any other I know of. It brings together a factual account of all aspects of ATLAS’s first 25 years. Perhaps as time passes and the participants mellow, the companion story of the how, in addition to the what and where, will also be written.
Concise and accessible, Calorimetry for Collider Physics is a reference book worthy of the name. Well known experts Michele Livan and Richard Wigmans have written an up-to-date introduction to both the fundamental physics and the technical parameters that determine the performance of calorimeters. Students and senior experts alike will be inspired to deepen their study of the characteristics of these instruments – instruments that have become crucial to most contemporary experiments in particle physics.
Following a light and attractive introductory chapter, the reader is invited to refresh his or her knowledge of the interactions of particles with matter. Key topics such as shower development, containment and profile, linearity and energy resolution are discussed for both electromagnetic and hadronic components. The authors provide illustrations with test-beam results and detailed Monte Carlo simulations. Practical and numerical examples help the reader to understand even counterintuitive effects, stimulating critical thinking in detector designers, and helping the reader develop a feeling for the importance of the various parameters that affect calorimetry.
The authors do not shy away from criticising calorimetric approaches
An important part of the book is devoted to hadron calorimetry. The authors have made a remarkably strong impact on the understanding of its fundamental problems through large set-ups in test beams, for example the famous lead-fibre “spaghetti” sampling calorimeter SPACAL. Among other issues, they correct “myths” about which processes really cause compensation, and discuss quantities that correlate with the invisible energy fraction from hadrons involved in the shower process and can be used, for example, to measure the electromagnetic shower fraction event by event. The development of the dual-readout calorimeter concept follows logically from there – a very promising future direction for this central detector component, as the book discusses in considerable detail. This technology would also sidestep the issue of longitudinal segmentation, which has a particular impact on linearity and calibration.
Livan and Wigmans’ book also gives a valuable historical overview of the field, and corrects several erroneous interpretations of past experimental results. The authors do not shy away from criticising calorimetric approaches in former, present and planned experiments, making the book “juicy” reading for experts. The reader will not be surprised that the authors are, for example, rather critical about highly segmented calorimeters aiming at particle flow approaches.
There is only limited discussion about other aspects of calorimetry, such as triggering, measuring jets and integrating calorimeters into an overall detector concept, which may impose many constraints on their mechanical construction. These aspects were obviously considered beyond the scope of the book, and indeed one cannot pack everything into a single compact textbook, though the authors do include a very handy appendix with tables of parameters relevant to calorimetry.
By addressing the fundamentals of calorimetry, Livan and Wigmans have provided an outstanding reference book. I recommend it highly to everybody interested in basic detector aspects of experimental physics. It is pleasant and stimulating to read, and if in addition it triggers critical thinking, so much the better!
Loop Quantum Gravity and Twistor Theory have a lot in common. They both have quantum gravity as a main objective, they both discard conventional spacetime as the cornerstone of physics, and they have both taken major inspiration from the renowned British mathematician Roger Penrose. Interaction between the two communities has been minimal so far, however, due to their distinct research styles: mathematically oriented in Twistor Theory, but focused on empirical support in Loop Gravity. This separation was addressed in the first week of September at a conference held at the Centre International de Rencontres Mathématiques (CIRM) on the Luminy campus in Marseille, where about a hundred researchers converged for lively debates designed to encourage cross-fertilisation between the two research lines.
Both Twistor Theory and Loop Gravity regard conventional smooth general-relativistic spacetime as an approximate and emergent notion. Twistor theory was proposed by Roger Penrose as a general geometric framework for physics, with the long-term aim of unifying general relativity and quantum mechanics. The main idea of the theory is to work not on the manifold of points of physical spacetime, but on the null rays, namely the space of the possible paths that a light ray can follow in spacetime. Spacetime points, or events, are then seen as derived objects: they are given by compact holomorphic curves in a complex three-fold, twistor space. It is remarkable how much the main equations of fundamental physics simplify when formulated in these terms. The mathematics of twistors has roots in the 19th-century Klein correspondence in projective geometry, and modern Twistor Theory has had a strong impact on pure mathematics, from differential geometry and representation theory to gauge theories and integrable systems.
Could allying twistors and loops be dangerous?
Loop gravity, on the other hand, is a background-independent theory of quantum gravity. That is, it does not treat spacetime as the background on which physics happens, but rather as a dynamical entity itself satisfying quantum theory. Conventional smooth general-relativistic spacetime emerges in the classical (ℏ→0) limit, in the same manner as a smooth electromagnetic field satisfying the Maxwell equations emerges from the Fock space of photons in the classical limit of quantum electrodynamics. Similarly, the full dynamics of classical general relativity is recovered from the quantum dynamics of loop gravity in a suitable limit. The transition amplitudes of the theory are finite in the ultraviolet and are expressed as multiple integrals over non-compact groups. The theory provides a compelling picture of quantum spacetime. A basis of the Hilbert space of the theory is described by the mathematics of spin networks: graphs with links labelled by SU(2) irreducible representations, introduced independently by Roger Penrose in the early 1970s in an attempt at a fully discrete, combinatorial picture of quantum physical space. Current applications of loop gravity include early cosmology, where the possibility of a bounce replacing the Big Bang has been extensively studied using loop-gravity methods, and black holes, where the theory’s amplitudes can be used to study the non-perturbative transition at the end of Hawking evaporation.
The communities working on twistors and loops share technical tools and conceptual pillars, but have evolved independently for many years, with different methods and different intermediate goals. Recent developments discussed at the Marseille conference, however, saw twistors appearing in formulations of the loop-gravity amplitudes, confirming the fertility and versatility of the twistor idea, and raising intriguing questions about possible deeper relations between the two theories.
The conference was a remarkable success. It is not easy to communicate across research programs in contemporary fundamental physics, because a good part of the field is stalled in communities blocked by conflicting assumptions, ingrained prejudices and seldom questioned judgments, making understanding one another difficult. The vibrant atmosphere of the Marseille conference cut through this.
The best moment came during Roger Penrose’s talk. Towards the end of a long and dense presentation of new ideas for understanding the full space of solutions of Einstein’s theory using twistors, Roger said rather dramatically that he was now going to present a big new idea that might lead to the twistor version of the full Einstein equations – but at that precise moment the slide projector exploded in a cloud of smoke, with sparks flying. We all thought for a moment that a secret power of the Universe, worried about being unmasked, had interfered. Could allying twistors and loops be dangerous?
An open letter addressed to the presidents of the European Parliament and the European Commission (EC) demanding better recognition for education and research has closed, having attracted around 13,600 signatories during the past two months.
Published on 17 September by a group of eight prominent particle physicists in Europe – Siegfried Bethke (MPI for Physics), Nora Brambilla (TU-München), Aldo Deandrea (U-Lyon 1), Carlo Guaraldo (INFN Frascati), Luciano Maiani (U-Roma La Sapienza), Antonio Pich (U-València), Alexander Rothkopf (U-Stavanger) and Johanna Stachel (U-Heidelberg) – the letter followed the announcement of a new EC organisational structure on 10 September in which former EC directorates for education, culture, sports and youth, as well as that for research, science and innovation, have been subsumed under a single commissioner with the titular brief “innovation and youth”.
“Words are important,” says Maiani, who was CERN Director-General from 1999–2003. “Omitting ‘research’ from the logo of the EC is reason for concern. The response we received, including from prestigious personalities, reassured us that this concern is widely shared.”
With signatories including hundreds of university and laboratory leaders, 19 Nobel laureates and many institutions including the European, French and German physical societies, the letter demands that the EC revises the title of the brief to “Education, Research, Innovation and Youth”. It states: “We, as members of the scientific community of Europe, wish to address this situation early on and emphasise both to the general public, as well as to relevant politicians on the national and European Union level, that without dedication to education and research there will neither exist a sound basis for innovation in Europe, nor can we fulfill the promise of a high standard of living for the citizens of Europe in a fierce global competition.”
The letter closed on 13 November after the EC issued a press release stating that it will rename three commissioner portfolios, but that the title of commissioner designate for innovation and youth, Mariya Gabriel, is not among those three being changed. “Naturally we are disappointed, and even frightened as the decisions about non-renaming prove that the omission of research and education in the title signals how low these fields may be valued by the new commission,” says Bethke.
“We will keep pushing,” adds Stachel. “But of course we are disappointed that the voices of more than 13,600 scientists went unheard, despite many prominent voices and also significant press coverage.”
On 18 November, Rothkopf responded to European Parliament president David Sassoli on behalf of the initial signatories with a letter, stating: “It is with great disappointment that we recognize that the voice of science has not reached the ears of the Commission. The intention of the Commission is to stimulate innovation. But we reiterate with force that without research and education there is no future to innovation… We are counting on you, Mr. President, to represent the voice of all European citizens who have signed up as supporters of the open letter and in the interest of European research.”
Update 28 November: Speaking at a plenary session of the European Parliament in Strasbourg on 27 November, European Commission (EC) president-elect Ursula von der Leyen announced that the brief of commissioner Mariya Gabriel would be renamed “Innovation, research, culture, education and youth”. The addition of “education” and “research” to the initial title of the brief announced on 10 September was met with applause in the chamber.
“I am less interested in mathematics than in mathematicians,” wrote Simone Weil to her brother André, a world-class mathematician who was imprisoned in Rouen at the time. The same might be said about US novelist and onetime mathematics student Karen Olsson. Despite the title, her new book, The Weil Conjectures, stars the extraordinary siblings at the expense of André’s mathematical creation.
First conceived by André in prison, and finally proven three decades later by Pierre Deligne in 1974, the Weil conjectures are foundational pillars of algebraic geometry. Linking the continuous and the discrete, and the realms of topology and number theory, they are pertinent to efforts to unite gravity with the quantum theories of the other forces. Frustratingly, though, mathematical hobbyists hoping for insights into the conjectures will be disappointed by this book, which instead zeroes in on the people in orbit around the maths.
Olsson is particularly fascinated by Simone Weil. An iconoclastic public intellectual in France, and possessed by an intensely authentic humanity that the author presents as quite alien to André, Simone was nevertheless envious of her brother’s mathematical insight, writing that she “preferred to die than to live without that truth”. Olsson is clearly empathetic, and so, one would suspect, will be most readers in a profession where intellect is all. Whether one is a grad student or a foremost expert in the field, there is always someone smarter, whose insights seem inaccessible.
Physicists may also detect echoes of the current existential crisis in theoretical physics (see Redeeming the role of mathematics) in Simone’s thinking. While she feels that “unless one has exercised one’s mind at the gymnastics of mathematics, one is incapable of precise thought, which amounts to saying that one is good for nothing,” she criticises “the absolute dominion that is exercised over science by the most abstract forms of mathematics.”
Peppered with anecdotes about other mathematicians – Girolamo Cardano is described as a “total dick” – and more a succession of scenes than a biography, the book is as much about Olsson herself as about the Weils. The prose zig-zags without warning between vignettes from the author’s own life and those of the Weils, leaving the reader to search for connections. Facts are unsourced, and readers are left to guess what is historical and what is the author’s impressionistic character portrait. Charming and quirky, the text transforms dusty perceptions of the meetings of the secret Bourbaki society of French mathematicians into scenes of lakeside debauchery and translucent camisoles that are almost reminiscent of Pride and Prejudice. Olsson even takes us into Simone’s dreams, with the conjectures only cropping up at the end of the book. If you limit your reading to the maths and the Weils, the resulting slim volume is a page turner.