This book provides a comprehensive introduction to classical field theory, which concerns the generation and interaction of fields and is the logical precursor of quantum field theory. Yet, while most university physics programmes teach classical mechanics before quantum mechanics, quantum field theory is normally not preceded by a dedicated course on classical field theory. The author argues that classical field theory deserves more room, since it offers a good way to think about modern physical model building.
The focus is on the relativistic structural elements of field theories, which enable a deeper understanding of Maxwell’s equations and of electromagnetic field theory. The same holds for other areas of physics, such as gravity.
The book comprises four chapters and is completed by three appendices. The first chapter provides a review of special relativity, with some in-depth discussion of transformations and invariants. Chapter two focuses on Green’s functions and their role as integral building blocks, offering as examples static problems in electricity and the full wave equation of electromagnetism. In chapter three, Lagrangian mechanics is introduced, together with the notions of a field Lagrangian and of action. The last chapter is dedicated to gravity, another classical field theory. The appendices include mathematical and numerical methods useful for field theories and a short essay on how one can take a compact action and develop from it all the physics known from EM.
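To give a flavour of the chapter-two material (a standard textbook relation, quoted here for orientation rather than taken from the book): in electrostatics the Green’s function of the Poisson equation acts as exactly such an integral building block, turning a charge distribution into a potential:

$$\nabla^2 G(\mathbf{r},\mathbf{r}') = -\,\delta^3(\mathbf{r}-\mathbf{r}'), \qquad G(\mathbf{r},\mathbf{r}') = \frac{1}{4\pi\,|\mathbf{r}-\mathbf{r}'|}, \qquad \phi(\mathbf{r}) = \frac{1}{\varepsilon_0}\int G(\mathbf{r},\mathbf{r}')\,\rho(\mathbf{r}')\,\mathrm{d}^3 r'.$$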
Written for advanced-undergraduate and graduate students, this book is meant for dedicated courses on classical field theory, but could also be used alongside other texts for advanced classes on EM or a course on quantum field theory, as well as a reference text for self-study.
This book is as elegant as it is deep. A masterful tour of the science of light and vision. It goes beyond artificial boundaries between disciplines and presents all aspects of light as it appears in physics, chemistry, biology and the neural sciences.
The text is addressed to undergraduate students – an added challenge for the author, which he meets brilliantly. Since many of the biological phenomena involved in our perception of light (in photosynthesis, image formation and image interpretation) happen ultimately at the molecular level, one is introduced rather early to the quantum treatment of the particles that make up light: photons. Complemented with the wave–particle duality characteristic of quantum mechanics, this makes it much easier to understand a large palette of natural phenomena without relying on the classical theory of light, embodied by Maxwell’s equations, whose mathematical structure is far more advanced than what is required. The classical approach also has the problem that one eventually needs the quantisation of the electromagnetic field to bring photons into the picture, which would make the text rather unwieldy and inaccessible to most undergraduates or biologists working in the field.
In the same way that the author instructs non-physics students in some basic physics concepts and tools, he also provides physicists with accessible and very clear presentations of many biological phenomena involving light. This is a textbook, not an encyclopaedia, hence a selection of such phenomena is necessary to illustrate the concepts and methods needed to develop the material. There are sections at the end of most chapters containing more advanced topics, and also suggestions for further reading to gain additional insight, or to follow some of the threads left open in the main text of the chapter.
A cursory perusal of the table of contents at the beginning will give the reader an idea of the breadth and depth of material covered. There is a very accessible presentation of the theory of colour, from a physical and biological point of view, and its psychophysical effects. The evolution of the eye and of vision at different stages of animal complexity, imaging, the mechanism of visual transduction and many more topics are elegantly covered in this remarkable book.
The final chapters contain some advanced topics in physics, namely, the treatment of light in the theory of quantum electrodynamics. This is our bread and butter in particle physics, but the presentation is more demanding on the reader than any of the previous chapters.
Unlike chapter zero, which explains the rudiments of probability theory in the standard frequentist and Bayesian approaches and can be understood by anyone familiar with high-school mathematics, chapters 12 and 13 require a more substantial background in advanced physics and mathematics.
The gestalt approach advocated by this book provides one of the most insightful, cross-disciplinary texts I have read in many years. It is mesmerising and highly recommendable, and will become a landmark in rigorous, but highly accessible interdisciplinary literature.
This book aims to provide physical sciences students with the computational skills that they will need in their careers and to expose them to applications of programming to problems relevant to their field of study. The authors, who are professors of physics at the University of Pittsburgh, decided to write this text to fill a gap in the scientific literature that they noticed while teaching and training young researchers. Graduate students often have only a basic knowledge of coding, so they must learn on the fly when asked to solve “real world” problems, like those involved in physics research. Since this way of learning is slow and far from optimal, the authors offer this guide as a more structured alternative.
Over almost 900 pages, this book introduces readers to modern computational environments, starting from the foundations of object-oriented computing. Parallel computation concepts, protocols and methods are also discussed early in the text, as they are considered essential tools.
The book covers various important topics, including Monte Carlo methods, simulations, graphics for physicists and data modelling, and devotes considerable space to algorithmic techniques. Many chapters are also dedicated to specific physics applications, such as Hamiltonian systems, chaotic systems, percolation, critical phenomena, few-body and many-body quantum systems, and quantum field theory. Nearly 400 exercises of varying difficulty complete the text.
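As a taste of the simplest of these techniques – a minimal illustrative sketch in Python, not code from the book – here is the classic Monte Carlo estimate of π by uniform sampling:

```python
# Minimal Monte Carlo example (illustrative, not from the book):
# estimate pi by sampling points uniformly in the unit square and
# counting those that fall inside the quarter circle of radius 1.
import random

def estimate_pi(n_samples: int = 1_000_000) -> float:
    inside = sum(
        1 for _ in range(n_samples)
        if random.random() ** 2 + random.random() ** 2 < 1.0
    )
    return 4.0 * inside / n_samples

print(estimate_pi())  # statistical error shrinks like 1/sqrt(n_samples)
```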
Even though most of the examples come from experimental and theoretical physics, this book could also be very useful for students in chemistry, biology, atmospheric science and engineering. Since the numerical methods and applications are sometimes technical, it is particularly appropriate for graduate students.
This book provides an excellent overview of the state of the art of quantum field theory (QFT) applications to condensed-matter physics (CMP). Nevertheless, it is probably not the best choice for a first approach to this wonderful discipline.
QFT is used to describe particles in the relativistic (high-energy) regime, but, as is well known, its methods can also be applied to problems involving many interacting particles – typically electrons. The conventional way of studying solid-state physics and, in particular, silicon devices does not make use of QFT methods, owing to the success of models in which independent electrons move in a crystalline substrate. Today, though, we deal with various condensed-matter systems that are impervious to that simple model and could instead profit from QFT tools. Among them: superconductivity beyond the Bardeen–Cooper–Schrieffer approach (high-temperature superconducting cuprates and iron-based superconductors), the quantum Hall effect, conducting polymers, graphene and silicene.
The author, as he himself states, aims to offer a unified picture of condensed-matter theory and QFT. Thus, he highlights the interplay between the two in many examples, showing how similar mechanisms operate in systems separated by several orders of magnitude in energy. He compares, for example, the Landau–Ginzburg field of a superconductor with the Anderson–Higgs field in the Standard Model. He also explains the not-so-well-known relation between the Yukawa mechanism for mass generation of leptons and quarks and the Peierls mechanism of gap generation in polyacetylene: the same trilinear interaction between a Dirac field, its conjugate and a scalar field that explains why polyacetylene is an insulator is responsible for the mass of elementary particles.
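Schematically – a textbook-level sketch with couplings and conventions simplified, not the book’s own notation – the trilinear interaction in question reads

$$\mathcal{L}_{\mathrm{int}} = -\,g\,\bar{\psi}\,\phi\,\psi,$$

so that when the scalar field settles at a non-zero value, $\phi \to v$, the Dirac field $\psi$ acquires a mass (or gap) $m = g v$: in the Standard Model $\phi$ is the Higgs field and $m$ a lepton or quark mass, while in polyacetylene $\phi$ describes the lattice distortion and $m$ the insulating gap.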
The book is structured into three parts. The first covers conventional CMP (at advanced undergraduate level). The second provides a brief review of QFT, with emphasis on the mathematical analysis and methods appropriate for non-trivial many-body systems (in particular in chapters eight and nine, where classical and quantum descriptions of topological excitations are given). I found remarkable the pages devoted to renormalisation, in which the author clearly shows that the renormalisation procedure is made necessary by the presence of interactions in any QFT, not by the divergences of a perturbative approach. The heart of the book is part three, composed of 18 chapters in which the author discusses state-of-the-art condensed-matter systems, such as topological insulators, and even quantum computation.
The last chapter is a clear example of the non-conventional approach proposed by the author: going straight to the point, he does not explain the basics of quantum computation, but rather discusses how to preserve the coherence of the quantum states storing information, in order to maintain the unitary evolution of quantum data-processing algorithms. In his words, “the main method of coherence protection involves excitations having the so-called non-abelian statistics”, which, going back to CMP, takes us to the realm of anyons and Majorana qubits. In my opinion, this book is not suitable for undergraduate or first-year graduate students (for whom the classic Condensed Matter Field Theory by Altland and Simons seems more appropriate). Instead, I would keenly recommend it to advanced graduate students and researchers in the field, who will find, in part three, plenty of hot topics that are very well explained and accompanied by complete references.
Since the discovery of high-energy extragalactic neutrinos by the IceCube collaboration in 2013, the hunt for the sources of such extreme cosmic events has been a major focus of neutrino astronomy. Now, in a multi-messenger measurement campaign involving more than 1000 scientists, IceCube and 18 independent partner observatories have identified such a cosmic particle accelerator – providing a first answer to the 100-year-old question concerning the origin of cosmic rays.
On 22 September 2017, IceCube – a cubic-kilometre neutrino detector installed in the 2.8 km-thick ice at the South Pole – registered a neutrino of likely astrophysical origin with a reconstructed energy of about 300 TeV. Less than a minute after the detection, IceCube’s automatic alert system sent a notice to the astronomical community, triggering worldwide follow-up observations. It was the 10th alert of this type sent by IceCube to the international astronomy community so far.
The neutrino event pointed to a 0.15 square-degree area in the sky, consistent with the position of a blazar called TXS 0506+056, an active galaxy whose jet points precisely towards Earth. The Fermi gamma-ray satellite found the blazar to be in a flaring state with a rare seven-fold increase in activity around the time of the neutrino event, making it one of the brightest objects in the gamma-ray sky at that moment. The MAGIC gamma-ray telescope in La Palma, Spain, then also recorded gamma rays with energies exceeding hundreds of GeV from the same region.
The convergence of observations convincingly implicates the blazar as the most likely source. A worldwide team from the various observatories involved conducted a statistical analysis to determine whether the correlation between the neutrino and the gamma-ray observations was perhaps just a coincidence, and found the chance for this to be around one in 1000.
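For orientation, a chance of one in 1000 corresponds to a significance of roughly 3σ. A minimal sketch of the conversion follows; the one-sided Gaussian convention used here is an assumption, as the article does not state which convention the collaborations adopted:

```python
# Convert a p-value to a Gaussian significance.
# One-sided convention assumed (common in particle physics).
from scipy.stats import norm

p_value = 1e-3
significance = norm.isf(p_value)  # inverse survival function
print(f"{significance:.2f} sigma")  # ~3.09 sigma
```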
Following the 22 September detection, the IceCube team searched the detector’s archival data and discovered a flare of more than a dozen lower-energy neutrinos detected in late 2014 and early 2015, which were also coincident with the blazar position. This independent observation greatly strengthened the initial detection of a single high-energy neutrino and was the start of a growing body of evidence for TXS 0506+056 being the first identified source of high-energy cosmic neutrinos. Furthermore, the distance to the blazar was determined to be about 4 billion light years (redshift z = 0.34) in the course of the follow-up observations, allowing the first luminosity determination for both gamma rays and neutrinos.
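As a rough cross-check of the quoted distance – a sketch using the astropy library with the Planck15 cosmology, since the article does not specify which cosmological parameters were assumed – the light-travel time to redshift z = 0.34 comes out close to 4 billion years:

```python
# Rough cross-check of the light-travel distance to TXS 0506+056.
# Illustrative only: the assumed cosmology (Planck15) is our choice,
# not necessarily the one used in the follow-up analyses.
from astropy.cosmology import Planck15

z = 0.34
print(Planck15.lookback_time(z))  # ~3.9 Gyr of light travel,
                                  # i.e. roughly 4 billion light-years
```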
“It came as no surprise that on 12 July, the date of the release of the new observations, and on the following days, several studies were published devoted to modelling the source,” says IceCube member Marek Kowalski from DESY. “It will be exciting to watch the high-energy astronomy community develop a coherent picture of the source over the next few months, as well as new strategies to identify similar events more frequently in the future.”
The concerted observational campaign uses instruments located all over the globe and in space, spanning an energy range from radio waves, through visible light to X-rays and gamma rays, as well as neutrinos. It is thus a significant achievement for the nascent field of multi-messenger astronomy. Since neutrinos are produced through the collisions of charged cosmic rays, the new observation implies that active galaxies are also accelerators of charged cosmic-ray particles. “More than a century after the discovery of cosmic rays by Victor Hess in 1912, the IceCube findings have therefore for the first time pinpointed a compelling candidate for an extragalactic source of these high-energy particles,” says IceCube principal investigator Francis Halzen.
Dark energy and dark matter together make up about 95% of the universe, yet we do not know the origin, constituents, or dynamics (apart from gravity) of these substances. Various extensions of the Standard Model (SM) of particle physics predict the existence of new particles as dark-matter candidates. One such model posits the existence of “dark quarks” that are charged under a new QCD-like force. Like normal SM quarks, dark quarks are only found in bound states (such as the dark proton, a stable dark-matter candidate resembling the ordinary proton) and they can only interact with SM quarks via a mediator particle. The similarity between the mechanisms of hadron production in dark and SM QCD would provide a natural explanation for the puzzling closeness of the observed energy densities of dark and baryonic matter.
In an attempt to explain the nature of dark matter, the existence of dark quarks was recently investigated by the CMS collaboration. If dark-QCD mediators were produced in pairs in the CMS detector, their signature would be striking: each mediator particle would decay into one dark quark and one SM quark, both of which hadronise and produce multiple dark and SM pions, respectively. Dark pions can travel sizable distances in the detector before decaying into detectable SM particles. Therefore, the signature would be two ordinary jets originating from the proton–proton collision, and two “emerging jets” composed of multiple neutral particles that decay at a significant distance away from their origin. Signal events could exhibit large missing transverse momentum from decays beyond the acceptance of the CMS detector.
To identify emerging jets, the CMS analysis relies on two discriminants that quantify the displacement of a jet’s constituents from the collision point. One is based on the impact parameters of the tracks associated with the jet; the other is the fraction of a jet’s energy carried by tracks compatible with the primary vertex. Figure 1 shows an event display for an emerging-jet candidate, with two jets containing multiple displaced vertices and consequently tagged by the discriminants. Substantial background is expected from the decays of B mesons and baryons, whose lifetimes make them more likely to pass the discriminating criteria. To model this background, the analysis derives flavour-dependent misidentification probabilities for jets.
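A minimal sketch of discriminants of this kind is given below. The variable names, thresholds and exact definitions are hypothetical illustrations of the idea, not the CMS analysis variables:

```python
# Illustrative jet-level displacement discriminants, of the kind
# described above. All names and cut values are hypothetical,
# not the CMS definitions.
from dataclasses import dataclass

@dataclass
class Track:
    pt: float    # transverse momentum (GeV)
    ip2d: float  # transverse impact parameter w.r.t. primary vertex (cm)

def median_ip(tracks: list[Track]) -> float:
    """Median transverse impact parameter of the jet's tracks."""
    if not tracks:
        return 0.0
    ips = sorted(t.ip2d for t in tracks)
    n = len(ips)
    return 0.5 * (ips[(n - 1) // 2] + ips[n // 2])

def prompt_energy_fraction(tracks: list[Track], max_ip: float = 0.05) -> float:
    """Fraction of the jet's track pT carried by tracks compatible
    with the primary vertex (small impact parameter)."""
    total = sum(t.pt for t in tracks)
    prompt = sum(t.pt for t in tracks if t.ip2d < max_ip)
    return prompt / total if total > 0 else 0.0

def looks_emerging(tracks: list[Track]) -> bool:
    # An emerging jet shows large typical displacement
    # and carries little prompt energy.
    return median_ip(tracks) > 0.05 and prompt_energy_fraction(tracks) < 0.2
```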
This first dedicated search for the emerging jet signature explores a broad dark-QCD parameter space with mediator masses between 0.4 and 2 TeV, dark-pion masses between 1 and 10 GeV, and dark-pion proper decay lengths between 1 mm and 100 cm. The observed number of events in the CMS data is consistent with the background-only expectation, excluding mediator particles with masses of 400–1250 GeV for dark-pion proper decay lengths between 5 and 225 mm (figure 2). While new data are being collected, the quest for dark matter at the LHC is broadening its scope towards new signatures.
When Lyn Evans, project leader of the Large Hadron Collider (LHC), turned up for work at the CERN Control Centre (CCC) at 05:30 on 10 September 2008, he was surprised to find the car park full of satellite trucks. Normally a scene of calm, the facility had become the focus of global media attention, with journalists poised to capture the moment when the LHC switched on. Evans knew the media were coming, but not quite to this extent. A few hours later, as he counted down to the moment when the first beam had made its way through the last of the LHC’s eight sectors, the CCC erupted in cheers – and Evans wasn’t even aware that his impromptu commentary was being beamed live to millions of people. “I thought I was commenting to others on the CERN site,” he recalls. The following weekend, he was walking in the nearby ski town of Megève when a stranger recognised him in the street.
Of all human endeavours that have captured the world’s attention, the events of 10 September 2008 are surely among the most bizarre. After all, this wasn’t something as tangible as sending a person to the Moon. At 10:28 local time on that clear autumn Wednesday, a bunch of subatomic particles made its way around a 27 km-long subterranean tube, and the spectacle was estimated to have reached an audience of more than a billion people. There were record numbers of hits to the CERN homepage, overtaking visits to NASA’s site, in addition to some 2500 television broadcasts and 6000 press articles on the day. The event was dubbed “first-beam day” by CERN and “Big Bang day” by the BBC, which had taken over a room in the CCC and devoted a full day’s coverage on Radio 4. Google turned its logo into a cartoon of a collider – such “doodles” are now commonplace, but it was a coup for CERN back then. It is hard to think of a bigger media event in science in recent times, and it launched particle physics, the LHC and CERN into mainstream culture.
It is all the more incredible that no collision data, and therefore no physics results, were scheduled that day; it was “simply” part of the commissioning period that all new colliders go through. When CERN’s previous hadron collider, the Super Proton Synchrotron, fired up in the summer of 1981, says Evans, there was just him and Carlo Rubbia in the control room. Even the birth of the Large Electron Positron collider in 1989 was a muted affair. The LHC was a different machine in a different era, and its birth offers a crash course in the communication of big-science projects.
News values
Fears that the LHC would create a planet-eating black hole were a key factor behind the enormous media interest, says Roger Highfield, who was science editor of the UK’s The Telegraph newspaper at the time. “I have no doubt that the public loved all the stuff about the hunt for the secrets of the universe, the romance of the Peter Higgs story and the deluge of superlatives about energy, vacuum and all that,” says Highfield. “But the LHC narrative was taken to a whole new level by the potty claim by doomsayers that it could create a black hole to swallow the Earth. When ‘the biggest and most complex experiment ever devised’ was about to be turned on, it made front-page news, with headlines like, ‘Will the world end on Wednesday?’”.
The conspiracies were rooted in attempts by a handful of individuals to prevent the LHC from starting up, on the grounds that its collisions might produce a microscopic black hole – one of the outlandish models that the LHC was built to test. That the protons injected into the LHC that day had an energy far lower than that of the then-operational Tevatron collider in the US, and that collisions were not scheduled for weeks afterwards, didn’t seem to get in the way of a good story. Nor, for that matter, did CERN’s efforts to issue scientific reassurances. Indeed, when the science editor of The Guardian, Ian Sample, turned up at CERN on first-beam day, he expected to find protestors chained to the fence outside, or at least waving placards asking physicists not to destroy the planet. “I did not see a single protestor – and I looked for them,” he says. “And yet, inside the building, I remember one TV host doing a piece to camera on how the world might end when the machine switched on. It was a circus that the media played a massive part in creating. It was shameful and it made the media who seriously ran with those stories look like fools.”
The truth is that the black-hole hype came long after the LHC had started to capture the public imagination. As the machine and its massive experiments progressed through construction in the early 2000s, the project’s scale and abstract scientific goals appealed to a sense of wonder. Though designed to explore a range of phenomena at a new energy frontier, the LHC’s principal quarry, the Higgs boson, had a bite-sized description: the generator of mass. It also had a human angle – a real-life, white-haired Professor Higgs and a handful of other theorists waiting to see if their half-century-old prediction was right, and international teams of thousands working night and day to build the necessary equipment. Nobel laureate Leon Lederman’s 1993 book The God Particle, detailing the quest for the Higgs boson, added a supernatural dimension to the enterprise.
“I am confident that no editor-in-chief of any newspaper in the world truly understood the Higgs field, the meaning or significance of electroweak symmetry breaking, or how the Higgs boson fits into the picture,” continues Sample. “But what they did get was the appeal of hunting for a particle that in their minds explained the origin of mass. It is such an intriguing concept to imagine that we even need to explain the origin of mass. Isn’t it the case that matter just has mass, plain and simple? All of this, in addition to the sheer awe at the engineering and physics achievement, made for an enormously exotic and appealing story.”
There were also more practical reasons for the LHC’s media extravaganza, notes Geoff Brumfiel, a reporter at Nature at the time and now a senior editor at National Public Radio in the US. The fact that pretty much every country and region on Earth had somebody working on the LHC meant that there was a local story for thousands of news outlets, he says, plus CERN’s status as a publicly funded institution made it possible for the lab to open up to the world. “There was also great visual appeal: the enormous, colourful detectors, deep underground, teeming with little scientists in hard hats – it just looked cool. That was hugely important for cable news, television documentary producers, etc.” In addition, says Brumfiel, something actually happened on first-beam day – there was something for journalists to see. “That’s always big in the news business. A big new machine was turning on and might or might not work. And when it worked there were lots of happy people to look at and hear.”
Strategy first
Despite the many external factors influencing LHC communications, the switch-on would never have had the huge reach that it did were it not for a dedicated communication strategy, says James Gillies, CERN’s head of communications at the time. It started as far back as 2000, when Dan Brown’s science-fiction novel Angels & Demons, about a plot to blow up the Vatican using antimatter stolen from CERN, was published. “Luckily for us, it didn’t sell, but it alerted us to the fact that the notion that CERN could be dangerous was bubbling up into popular culture,” says Gillies. A few years later, the BBC made a drama documentary called End Day, which examined a range of ways that humanity might not last the century – including a black hole being created at a particle accelerator. Then, when Dan Brown’s next book, The Da Vinci Code, became a bestseller, CERN realised that Angels & Demons would be next on people’s reading list – so it had better act. “That led to one of the most peculiar conversations that I’ve ever had with a CERN director general, and resulted in us featuring fact and fiction in Angels & Demons on the CERN website,” says Gillies. “Our traffic jumped by an order of magnitude overnight and we never looked back.” CERN later played a significant role in the screen adaptation of the book, and Sony Pictures included a short film about CERN in its Blu-ray release.
The first dedicated LHC communications strategy was put in place in 2006. The perception of CERN as portrayed in End Day and Angels & Demons was so wide of the mark that it was laughable, says Gillies, so he took it as an opportunity to lead the conversation about CERN and to be transparent and timely. In addition to actions such as working with science communicators in CERN Member States and beyond, to organise national media visits for key journalists, he says, “the big idea is that we took a conscious decision to do our science in the public eye, to involve people in the adventure of research at the forefront of human knowledge”. Publicly fixing the date for first beam was a high-risk strategy, but it paid off. The scheduled LHC start-up exceeded the expectations of everyone involved. Both proton beams made a full turn around the machine and one beam was captured by the radio-frequency system, showing that it could be accelerated. For the thousands of people working on the LHC and its experiments, it marked the transition from 25 years of preparation to a new era of scientific discovery. But the terrain was about to get tougher.
Once the journalists had departed and the champagne bottles had been cleared away, the LHC teams continued the task of commissioning away from the spotlight, with a view to obtaining collisions as soon as possible. Then, a couple of days after first-beam day, a transformer powering part of the LHC’s cryogenic system failed, forcing a pause in commissioning during which the teams decided to test the last octant of the machine for high-current operations. While ramping the magnets towards 9.3 kA on 19 September, one of the LHC’s 10,000 superconducting-dipole interconnects failed, ultimately damaging roughly 400 m of the machine. Evans described the event, which set operations back by 14 months, as “a kick in the teeth”. But CERN recovered quickly (see “Lessons from the accelerator frontier“) and, today, Evans says that he is glad that the fault was discovered when it was. “It would have been a disaster had it happened five years in. As it was, we didn’t come under criticism. We were pushing the limits of technology.”
The timing of the incident was doubly fortuitous: the same week it took place, US investment bank Lehman Brothers filed for the largest bankruptcy in history, with other banks looking set to follow suit. The world might not have been consumed by a black hole, but the prospect of a distinctly more real financial Armageddon dominated the headlines that week.
To collisions and beyond
The coming to life of the LHC is a thrilling story, a scientific fairy-tale. From its long-awaited completion, to the tense sector-by-sector threading of its first beam in front of millions of people and the incident nine days later that temporarily ruined the party, the LHC finally arrived at a new energy frontier in November 2009 (achieving 1.18 TeV per beam). Its physics programme began in earnest a few months later, on 30 March 2010, at a collision energy of 7 and then 8 TeV. Barely two years later, the LHC produced its first major discovery – the Higgs boson, announced to a packed CERN auditorium on 4 July 2012 by the ATLAS and CMS collaborations and webcast around the world. The discovery was followed by the award of the 2013 Nobel Prize in Physics to Peter Higgs and François Englert. The CERN seminar was the first time that the pair had met, with cameras capturing Higgs wiping a tear from his eye as the significance of the event sank in. Since 2015, the LHC has been operating at 13 TeV while notching up record levels of performance, and the machine is now being prepared for its high-luminosity upgrade (HL-LHC).
Has the success of LHC communications set the bar too high? The CERN press office tracked a steady increase in the number of LHC-related articles in the period leading up to the switch-on, in addition to an increasing number of visits by the media and the public. Coverage peaked around September 2008, died down a little, then picked up again four years later as the drama of the Higgs-boson discovery started to unfold. When ATLAS and CMS announced the discovery, press coverage exceeded even that of first-beam day. Of the top 10 most-read items on The Guardian website, says Sample, stories about the Higgs made up eight or nine of them, when there were plenty of other big news stories around that day. Why? “The absolute competence and dedication and hard work of those scientists and engineers was so refreshing compared to the crooks, bullies, liars and murderers that we write about every day,” he says. “Perhaps people enjoyed reading about something positive, about people doing astounding work, about something far bigger than the world they normally encounter in the news.”
Today, press coverage of the LHC remains higher than it was before the switch-on, with an average of 200 clippings per day worldwide. The number of media visits to CERN, having peaked in around 2008 and 2012, is now at the level that it was before the switch-on, corresponding to around 300 media outlets per year. The LHC’s life so far has also coincided with the explosion of social-media tools. CERN’s first ever tweet, on 7 August 2008, announced the date for first-beam day, and today the lab has more than two million Twitter followers – rising at a rate of around 1000 per day. During the announcement of the Higgs-boson discovery in 2012, CERN’s live tweets reached journalists faster than the press release and helped contribute to worldwide coverage of the news.
Framing the search for the Higgs boson as the LHC’s only physics goal was never the message that CERN intended to put out, but it’s the one that the media latched on to. Echoing others working in the media who were interviewed for this article, Brumfiel thinks that the LHC has largely left the public eye. In terms of the media, he says, “It’s a victim of its own success: it was designed to do one thing, and it’s done it.”
The challenge facing communications at CERN today is how to capitalise on the existing interest while constructing a new or updated narrative of exploration and discovery. After all, in terms of physics measurements, the LHC is only getting into its stride – having collected just 5% of its expected total dataset and with up to two decades of operations still to go. Although the LHC has not yet found any conclusive signs of physics beyond the Standard Model, it is clear from astronomical and other observations that such phenomena are out there, somewhere. In the absence of direct discoveries, identifying the new physics will be a hard slog involving ever more precise measurements of known particles – a much tougher sell to the public, even if it is all part of the same effort to uncover the basic laws of the universe.
“CERN has managed to build upon previous communication successes as the public is already interested, so they can simply strap a camera onto a drone, fly it around and a lot of people will happily watch!” says David Eggleton of the Science Policy Research Unit at the University of Sussex in the UK, who studies leadership and governance in major scientific projects such as the LHC. “But, just like with the scientists, the public is going to need something new and exciting to focus on – even if the pay-off is 10 years in the future, so it depends on how the laboratory wants to strategise – do they want to pitch HL-LHC as the next big machine or is it just going to be articulated as an upgrade with the FCC (Future Circular Collider) becoming the thing to capture the public’s imagination?”
Theoretical physicist and science populariser Sabine Hossenfelder of the Frankfurt Institute for Advanced Studies in Germany thinks the excitement surrounding the switch-on of the LHC has come back to haunt the field, going so far as to label the current situation in particle physics a “PR disaster”. Before the LHC’s launch in 2008, she says, some theorists expressed confidence that the collider would produce new particles besides the Higgs boson. “That hasn’t happened. The big proclamations came almost exclusively from theoretical physicists; CERN didn’t promise anything that they didn’t deliver. That is an important distinction, but I am afraid in the public perception the subtler differences won’t matter.”
Cultural icon
At least for now, and in some countries, the LHC has become embedded in popular culture. The term “hadron collider” is the new “rocket science” – a term dropped into commentary and public discourse to denote the pinnacle of human ingenuity. The LHC has inspired books, films, plays, art and, crucially, adverts – in which firms have used high-production visuals to associate their brands with the standards of the LHC. The number of applications for physics degrees, in the UK at least, soared around the time that the LHC switched on, and the event also launched the television career of ATLAS physicist Brian Cox, who went on to further engage a primed public. Annually, around 300,000 people apply to visit CERN, less than half of whom can be accommodated.
If the communications surrounding the LHC have proved one thing, it is that there is an inherent interest among huge swathes of the global population in the substance of particle physics. Highfield, who is now director of external affairs at the Science Museum in London, sees this on a daily basis. “Although I think physicists would have liked to have seen more surprises, I know from my work at the Science Museum that the public has a huge appetite for smashing physics,” he says. In November 2013, the Science Museum launched Collider, an immersive exhibition that blended theatre, video and sound art with real artefacts from CERN to recreate a visit to the laboratory. The exhibition went on international tour, finishing in Australia in April 2017, having pulled in an audience of more than 600,000 people. “Yes, the public still cares about the quest to reveal the deepest secrets of the cosmos,” says Highfield.
From a communications perspective, the switch-on of the LHC proves the importance of a clear strategy, the rewards from taking risks, and the difficulty in keeping control of a narrative. For Evans, the LHC changed everything. “Of all the machines that I’ve worked on, never before has there been such interest,” he says. “Before the LHC, no one knew what you were talking about. Now, I can get into a cab in New York or speak to an immigration officer in Japan, and they say: oh, cool, you work at CERN?”.
Since the beginning of the LHC physics programme in 2010, the LHCb collaboration has been working to drive down the uncertainty on the least-precisely measured angle of the unitarity triangle, γ. The unitarity triangle exists in the complex plane and its area is a measure of the amount of CP violation in the Standard Model. Mathematically, the triangle represents a requirement that the Cabibbo–Kobayashi–Maskawa (CKM) quark-mixing matrix is unitary, meaning that the number of quarks is conserved in weak interactions and that there are only three generations of quarks. If new physics exists and breaks these assumptions, it would show up as internal inconsistencies in the unitarity triangle – for example, the angles of the triangle might not add up to 180°. Checking the consistency of different measurements of the unitarity triangle is therefore an important test of the SM.
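Concretely – these are the standard definitions, quoted here for orientation – the triangle in question expresses the unitarity relation

$$V_{ud}V_{ub}^{*} + V_{cd}V_{cb}^{*} + V_{td}V_{tb}^{*} = 0, \qquad \gamma \equiv \arg\!\left(-\frac{V_{ud}V_{ub}^{*}}{V_{cd}V_{cb}^{*}}\right),$$

whose three terms close into a triangle in the complex plane if, and only if, the CKM matrix is unitary.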
Experimentally, γ can be measured through the interference between b̅ → c̅ u s̅ and b̅ → u̅ c s̅ transitions. It is the only CKM angle that is easily accessible in tree-level processes and, as a result, it can be measured with negligible theoretical uncertainty. In the absence of new-physics effects at tree level, a precise measurement of γ can be compared with other observables related to the CKM matrix that are more likely to be affected by physics beyond the SM. Such comparisons are currently limited by the relatively large uncertainty on γ.
LHCb has recently made a model-independent study of the decay mode B±→ DK± (where D could be D0 or D̅0), with the D meson being reconstructed via the decays D → KS0 π+ π− and D → KS0 K+ K−. This measurement is particularly important for determining γ, as it selects a single solution without ambiguities and with small uncertainty. Because the D meson undergoes a three-body decay, the distribution of events across the phase space (the Dalitz plot) carries information about the underlying amplitudes. And since the B±→ DK± amplitudes depend on γ, it is possible to measure γ by comparing the distributions for B+ and B−. In practice, the distributions depend mainly on the amplitudes of the D decay, with only small CP-violating deviations introduced by γ. The measurement therefore demands a good understanding of the magnitudes of the D0 and D̅0 decay amplitudes, as well as the strong phase differences between them, δD. The former comes from high-statistics calibration channels, and the latter from external measurements performed by the CLEO collaboration.
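In the standard notation of this model-independent Dalitz-plot (“GGSZ”) method – a schematic form, with conventions as usually defined in the literature rather than quoted from the LHCb paper – the interference that gives sensitivity to γ can be written as

$$\mathcal{A}(B^{\pm}\to D K^{\pm}) \;\propto\; A_{D}(m^{2}_{\pm}, m^{2}_{\mp}) \;+\; r_{B}\, e^{\,i(\delta_{B}\pm\gamma)}\, A_{D}(m^{2}_{\mp}, m^{2}_{\pm}),$$

where $A_{D}$ is the D-decay amplitude at the Dalitz-plot point $(m^{2}_{+}, m^{2}_{-})$, and $r_{B}$ and $\delta_{B}$ are the magnitude ratio and strong-phase difference of the two B-decay amplitudes; comparing the B⁺ and B⁻ distributions then isolates γ.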
The new measurement uses 2 fb−1 of proton–proton collision data taken in 2015 and 2016, with signals of about 4100 B± → DK± decays for the more copious D → KS0 π+ π− mode and about 560 for D → KS0 K+ K−. LHCb found γ = (87 +11 −12)°, which is consistent with the previous world average, as well as measuring the other decay parameters rB = 0.087 +0.013 −0.014 and δB = (101 ± 11)°. This is the most precise determination of γ from a single analysis, and LHCb has performed several other measurements of γ, each providing different constraints. Their combination (γ = (74.0 +5.0 −5.8)°, see figure) dominates the current world average and allows increasingly precise tests for new physics by probing the internal consistency of the unitarity triangle.
When the Large Hadron Collider (LHC) started up a decade ago, on 10 September 2008, those of us who had been involved in designing and building the machine were extremely happy. I was in the CERN control centre that day, as was my predecessor, Romeo Perin, who first led the study for the LHC magnet design. Also present was Carlo Rubbia, without whose vision and battling spirit we would not have the LHC today, along with other former CERN Directors-General: Herwig Schopper, under whom initial discussions and workshops took place; Christopher Llewellyn-Smith, who got the LHC approved and secured international collaboration; Luciano Maiani, who took the decision to close LEP to make way for the LHC; and Robert Aymar, who saw the LHC to first beam.
That more than 300 journalists were present made it even more remarkable. CERN opened up to the world as it never did before and news of the event reached more than a billion people (see “The day the world switched on to particle physics“). But for many of us, our heads were already in the future, thinking about what was then simply called the LHC upgrade.
The first paper proposing a luminosity upgrade of the LHC was written in 1994 – the year the LHC was approved – and was inspired by Giorgio Brianti, who led the LHC design effort until the baton passed to Lyn Evans in 1993. He envisaged the benefits from a future superconductor, made of niobium tin rather than the niobium-titanium used in the LHC dipoles, to increase the luminosity of the LHC. Later, I proposed that INFN carry out research on this conductor and I have worked on the high-luminosity LHC (HL-LHC) ever since.
From a technical point of view, the LHC was a turning point in collider design, demanding a vast quantity of superconducting magnets with unprecedented field strengths. The past 10 years have also taught us that the machine is very well designed indeed (thanks Lyn!). The LHC works so well, in fact, that it’s easy for operators to forget that it is a superconducting machine. I realised immediately when those first protons made their way around the ring that the LHC is, as we say in Italian, “bionic”: it can do things beyond our expectations.
But we also learned, nine days later, when the breakdown of an electrical interconnect led to significant damage to one section of the machine, that the LHC is very fragile. This we will never forget: we have to be careful and we have to be humble, as a machine of this scale and complexity can stop working at any moment. The incident on 19 September happened in an interconnect between two magnets; it was not a technical difficulty but one of the LHC’s innumerable complex interfaces that tricked us. Bad as it was, we could repair it in a reasonable time period. Had we made a technical mistake concerning the superconductor itself, or the basic magnet design, it would have been a potential show-stopper.
The LHC is so complicated that it’s impossible not to make mistakes. What’s important is that we learn from them. Not only did we learn how to repair and to react very fast, we also learned how to work cooperatively. It was a healthy sociological exercise for CERN, where, like in any large organisation, it is easy for people, divisions and departments to insulate themselves from others. After the incident, the spirit of collaboration has clearly been stronger.
This is definitely the case for the HL-LHC, where we are working not only on how to produce higher luminosity, but also on how to make it the most effective for the LHC experiments. In addition, and in contrast to the LHC, which was first approved and then sought partnership, HL-LHC has been a partnership with other institutions since the very beginning. It is another good lesson, and one that paves the way for a future supercollider beyond the LHC.
We are now exactly halfway through the HL-LHC programme, which was set up as a design study in 2010, approved with full budget in 2016, and will start operating in 2026. Now we have to complete the second half of the journey to generate a luminous future for our young colleagues. I will be long retired when the HL-LHC switches on, but I expect that, when it does, the CERN control centre will once again be a scene of celebration.
The ALICE collaboration has released a new measurement of the production of D0, D+, D*+ and Ds+ mesons, which contain a charm quark, in lead–lead (PbPb) collisions at a centre-of-mass energy per nucleon pair (√sNN) of 5.02 TeV. These measurements probe the propagation of charm quarks in the quark–gluon plasma (QGP) produced in high-energy heavy-ion collisions. Charm quarks are produced early in the collision and subsequently experience the whole system evolution, losing part of their energy via inelastic (gluon radiation) or elastic (“collisional”) scattering processes. The charm quarks emerge from the collision in D mesons, which are identified by their characteristic decays.
The result is reported in terms of the nuclear modification factor (RAA), which is the ratio between the measured pT distribution in heavy-ion and proton–proton (pp) collisions, scaled by the average number of binary nucleon–nucleon collisions in each nuclear collision. The figure shows the average RAA of non-strange D mesons (D0, D+, D*+) and strange (D+s) mesons in central (0–10%) PbPb collisions. For the non-strange mesons, a minimum of RAA ≈ 0.2 for pT = 6–10 GeV/c indicates a significant energy loss for charm quarks. The RAA is compatible with that of charged particles for pT > 8 GeV/c, while it is larger at lower pT. The comparison to light-flavour hadrons helps to study the colour-charge and quark-mass dependence of the in-medium parton energy loss.
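In symbols, following the definition given in the text:

$$R_{AA}(p_{T}) \;=\; \frac{1}{\langle N_{\mathrm{coll}}\rangle}\, \frac{\mathrm{d}N_{AA}/\mathrm{d}p_{T}}{\mathrm{d}N_{pp}/\mathrm{d}p_{T}},$$

where $\langle N_{\mathrm{coll}}\rangle$ is the average number of binary nucleon–nucleon collisions; $R_{AA}=1$ would indicate no nuclear modification.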
The RAA of Ds+ mesons is larger than that of non-strange D mesons. Though the experimental uncertainty is still large, such a difference would suggest that charm quarks also form hadrons by recombining with the surrounding light quarks in the QGP. This mechanism differs from the fragmentation process that is thought to be the main hadronisation mechanism in the absence of a medium. The recombination mechanism enhances the yield of particles with strangeness because strange quarks are copiously produced in the QGP.
The RAA measured in LHC Run 2 is compatible with that measured at the lower centre-of-mass energy per nucleon pair of 2.76 TeV, but the larger data sample collected in Run 2 made it possible to reduce the uncertainties by a factor of about two. A similar suppression at the two energies is predicted by the “Djordjevic model” (figure, right), owing to the combination of a stronger suppression in the denser medium and a harder pT distribution at 5.02 TeV with respect to 2.76 TeV.
The next PbPb run at the end of 2018, and the subsequent upgrade of the ALICE detector, will allow us to improve the measurement. This will shed further light on the energy loss and hadronisation of heavy quarks in the QGP and allow researchers to determine the transport coefficients describing the scattering power of the QGP and the diffusion of charm quarks in the medium.