
Foundations of High-Energy-Density Physics: Physical Processes of Matter at Extreme Conditions

By Jon Larsen
Cambridge University Press

This book provides a comprehensive overview of high-energy-density physics (HEDP), which concerns the dynamics of matter at extreme temperatures and densities. Such matter is present in stars, active galaxies and planetary interiors, while on Earth it is not found in normal conditions, but only in the explosion of nuclear weapons and in laboratories using high-powered lasers or pulsed-power machines.

After introducing, in the first three chapters, many fundamental physics concepts necessary to the understanding of the rest of the book, the author delves into the subject, covering many key aspects: gas dynamics, ionisation, the equation-of-state description, hydrodynamics, thermal energy transport, radiative transfer and electromagnetic wave–material interactions.

The author is an expert in radiation-hydrodynamics simulations and is known for developing the HYADES code, which is widely used in the HEDP community. This book can serve as a resource for research scientists and graduate students in physics and astrophysics.

Quantized Detector Networks: The Theory of Observation

By George Jaroszkiewicz
Cambridge University Press

Quantised Detector Networks (QDN) theory was invented to reduce the level of metaphysics in the application of quantum mechanics (QM), moving the focus from the system under observation to the observer and the measurement apparatuses. This approach is based on the consideration that “labstates”, i.e. the states of the system we use for observing, are the only things we can actually deal with, while we have no means to prove that the objects under study “exist” independently of observers or observations.

In this view, QM is not a theory describing objects per se, but a theory of entitlement, which means that it provides physicists with a set of rules defining what an observer is entitled to say in any particular context.

The book is organized in four parts: Basics, Applications, Prospects, and Appendices. The author provides, first of all, the formalism of QDN and then applies it to a number of experiments that show how it differs from standard quantum formalism. In the third part, the prospects for future applications of QDN are discussed, as well as the possibility of constructing a generalised theory of observation. Finally, the appendices collect collateral material referred to at various places in the book.

The aim of the author is to push readers to look in a different way at the world they live in, to show them the cognitive traps caused by realism – i.e. the assumption that what we observe has an existence independent of our observation – and to alert them that various speculative concepts and theories discussed by some scientists have no empirical basis; in other words, they cannot be experimentally tested.

The Great Silence – The Science and Philosophy of Fermi’s Paradox

By Milan Cirkovic
Oxford University Press

Enrico Fermi formulated his eponymous paradox during a casual lunchtime chat with colleagues in Los Alamos: the great physicist argued that, probabilistically, intelligent extraterrestrial lifeforms had time to develop countless times in the Milky Way, and even to travel across our galaxy multiple times; but if so, where are they?

The author of this book, Milan Cirkovic, claims that, with the wealth of scientific knowledge accumulated in the many decades since then, the paradox is now even more severe. Space travel is no longer speculative; we know that planetary systems – including Earth-like planets – are common, that life on our planet started very early, and that our solar system is a relative late-comer on the cosmic scene. Hence, we should expect many civilisations to have evolved well beyond our current stage. Given the huge numbers involved, Cirkovic remarks, the paradox would not even be completely solved by the discovery of another civilisation: we would still have to figure out where all the others are!

The Great Silence aims at an exhaustive review of the solutions proposed to this paradox in the literature (where “literature” is to be understood in the broadest sense, ranging from scholarly astrobiology papers to popular-science essays to science-fiction novels), following a rigorous taxonomic approach. Cirkovic’s taxonomy is built from the analysis of which philosophical assumptions create the paradox in the first place. Relaxing the assumptions of realism, Copernicanism, and gradualism leads, respectively, to the families of solutions that Cirkovic labels “solipsist”, “rare Earth”, and “neocatastrophic”. His fourth and most heterogeneous category of solutions, labelled “logistic”, arises from considering possible universal limitations of physical, economic or metabolic nature.

The book starts by setting a rigorous foundation for discussion, summarising the scientific knowledge and dissecting the philosophical assumptions. Cirkovic does not seem interested in captivating the reader from the start: the preface and the first three chapters are definitely scholarly in their intentions, and assume that the reader already knows a great deal about Fermi’s paradox. As a particularly egregious example, Kardashev’s speculative classification of civilisations, based on the scale of their energy consumption, plays a very important role in this book; one would have therefore expected a discussion about that, somewhere at the beginning. Instead, the interested reader has to resort to a footnote for a succinct definition of the three types of civilisation (Type I: exploiting planetary resources; Type II: using stellar system resources; Type III: using galactic resources).

However, after these introductory chapters, Cirkovic’s writing becomes very pleasant and engaging, and his reasoning unfolds clearly. Chapters four to seven are the core of the book, each of them devoted to the solutions allowed by negating one assumption. Every chapter starts with an analogy with a masterpiece in cinema or literature, followed by a rigorous philosophical definition. Then, the consequent solutions to Fermi’s paradox are reviewed and, finally, a résumé of take-home messages is provided.

This parade of solutions gives a strange feeling: each of them sounds either crazy, or incredibly unlikely, or insufficient to solve the paradox (at least in isolation). Still, once we accept Cirkovic’s premise that Fermi’s paradox means that some deeply rooted assumption cannot be valid, we are compelled to take seriously some outlandish hypotheses. The reader is invited to ponder, for example, how the solution to the paradox might depend on the politics of the Milky Way in the last few billion years: extraterrestrial civilisations may have all converged to a Paranoid Style in Galactic Politics, or we might unknowingly be under the jurisdiction of an Introvert Big Brother (Cirkovic has a talent for catchy titles). Some Great Old Ones might be temporarily asleep, or we (and any conceivable biological intelligence) might be limited in our evolution by some Galactic Stomach-Ache. A large class of very gloomy hypotheses assumes that all our predecessors were wiped out before reaching very high Kardashev scores, and Cirkovic seems particularly fond of the idea of swarms of Deadly Probes that may still be roaming around, ready to point at us as soon as they notice our loudness – unless we reach the aforementioned state of galactic paranoia, which makes for a very nice synergy between two distinct solutions of the paradox.

The author not only classifies the proposed solutions, but also rates them by how fully they would solve this paradox. The concluding chapter elaborates on several philosophical challenges posed by Fermi’s paradox, in particular to Copernicanism, and on the link between it and the future of humanity.

Cirkovic is a vocal (and almost aggressive) critic of most of the SETI-related literature, claiming that it relies on excessive assumptions that strongly limit SETI searches. In his words, the failure of SETI so far has mostly occurred on philosophical and methodological levels. He quotes Kardashev in saying that extraterrestrial civilisations have not been found because they have not really been searched for. Hence Cirkovic’s insistence on a generalisation of targets and search methods.

An underlying theme of this book is the relevance of philosophy for the advancement of science, in particular when a science is in its infancy, as he argues is the case for astrobiology. Cirkovic draws an analogy with early 20th-century cosmology, including a similarity between Fermi’s and Olbers’ paradoxes (the latter being: how can the night sky be dark, if we are reachable by the light of an infinite number of stars in an infinitely old universe?).

I warmly recommend The Great Silence to any curious reader, in spite of its apparent disinterest for a broad readership. In it, Cirkovic makes a convincing case that Fermi’s paradox is a fabulously complex and rich intellectual problem.

Strange Glow: The Story of Radiation

By Timothy J Jorgensen
Princeton University Press


In this book, Timothy Jorgensen, a professor of radiation medicine at Georgetown University in the US, recounts the story of the discovery of radioactivity and how mankind has been transformed by it, with the aim of sweeping away some of the mystery and misunderstanding that surrounds radiation.

The book is structured in three parts. The first is devoted to the discovery of ionising radiation in the late 19th century and its rapid application, notably in the field of medical imaging. The author establishes a vivid parallel with the discovery and exploitation of radio waves, the non-ionising, lower-energy counterpart of X-rays. A dynamic narrative, peppered with personal anecdotes by key actors, succeeds in conveying the decisive scientific and societal impact of radiation and related discoveries. The interleaving of the history of the discovery with aspects of the lives of inspirational figures such as Ernest Rutherford and Enrico Fermi is certainly relevant, attractive and illustrative.

In the second part, the author focuses on the impact of ionising radiation on human health, mostly through occupational exposure in different working sectors. A strong focus is on the case of the “radium girls” – female factory workers who were poisoned by radiation from painting watch dials with self-luminous paint. This section also depicts the progress in radiation-protection techniques and the challenges related to quantifying the effects of radiation and establishing limits for the exposure to it. The text succeeds in outlining the difficulties of linking physical quantities of radiation with its impact on human health.

The risk assessment related to radiation exposure and its impact on human health is further covered in the third part of the book. Here, Jorgensen aims to provide quantitative tools for the public to be able to evaluate the benefits and risks associated with radiation exposure. Despite his effort to offer a combination of complementary statistical approaches, readers are left with an impression that many aspects of the impact of radiation on human health are not fully understood. Nevertheless, the large number of radiation-exposure cases in the Hiroshima and Nagasaki nuclear bombings, after which it was possible to correlate the absorbed dose with the location of the various victims at the time of the explosion, provides a scientifically valuable sample to study both deterministic and stochastic effects of radiation on human health.

In part three, the book also digresses at length about the role of nuclear weapons in the US defence and geopolitical strategy. This topic seems somewhat misplaced with respect to the more technical and scientific content of the rest of the text. Moreover, it is highly US-centric, often neglecting the analogous role of such weapons in other countries.

It is noteworthy that the book does not cover radiation in space and its crucial impact on human spaceflight. Likewise, the discovery of cosmic radiation through Hess’ balloon experiment in 1911–1912, while constituting an essential finding in addition to the already discovered radioactivity from elements on the Earth’s surface, is completely overlooked.

Despite the lack of space-radiation coverage and the somewhat uncorrelated US-defence considerations, this book is definitely a very good read that will satisfy the reader’s curiosity and interest with respect to radiation and its impact on humans. In addition, it provides insight into the more general progress of physics, especially in the first half of the 20th century, in a highly dynamic and entertaining manner.

ROOT’s renovation takes centre stage at Sarajevo meeting

Participants of the ROOT workshop

The 11th ROOT Users’ Workshop was held on 10–13 September in Sarajevo, Bosnia and Herzegovina, at the Academy of Science and Arts: an exceptional setting that also provided an opportunity to involve Bosnia and Herzegovina in CERN’s activities.

The SoFTware Development for Experiments group in the experimental physics department at CERN drives the development of ROOT, a modular software toolkit for processing, analysing and visualising scientific data. ROOT is also a means to read and write data: LHC experiments alone produced about 1 exabyte of data stored in the ROOT file format.

Thousands of high-energy physicists use ROOT daily to produce scientific results. For the ROOT team, this is a big responsibility, especially considering the challenges that Run 3 of the LHC and the High-Luminosity LHC (HL-LHC) pose to all of us. Luckily, we can rely on a lively user community, whose contributions are valuable enough that a ROOT users’ workshop is organised periodically. The event’s objective is to gather the ROOT community of users and developers to collect criticism, praise and suggestions: a unique occasion to shape the future of the ROOT project.

More than 100 people attended this year’s workshop, a 30% increase from 2015, making the event a success. What’s more, the diversity of the attendees – students, analysis physicists, software experts and framework developers – brought different levels of expertise to the event. The workshop featured 69 contributions as well as engaging discussions. Software companies participated, with three invited contributions: Peter Müßig from SAP presented OpenUI5, the core of the SAP javascript framework that will be used for ROOT’s graphical user interface; Chandler Carruth from Google discussed ways to make large-scale software projects written in C++, the language for number-crunching code in high-energy and nuclear physics (HENP), simpler, faster and safer; and Sylvain Corlay from Quantstack showed novel ways to tackle numerical analysis with multidimensional array expressions. These speakers said they enjoyed the workshop and plan to come to CERN to extend the collaboration.

ROOT’s renovation was the workshop’s main theme. To stay at the bleeding edge of software technology, ROOT – which has been the cornerstone of virtually all HENP software stacks for two decades – is undergoing an intense upgrade of its key components. This effort represents an exciting time for physicists and software developers. At the event, ROOT users expressed their appreciation of the effort to make the toolkit easier to use and faster on modern computer architectures, with the sole objective of reducing the time between data delivery and the presentation of plots.

In particular, the spotlight was on the modernisation of the I/O subsystem, crucial for the future LHC physics programme; ROOT’s parallelisation, a prerequisite to face Run 3 and HL-LHC analyses; as well as on new graphics, multivariate tools and an interface to the Python language, which are all elements of prime importance for scientists’ everyday work.

The participants’ feedback was enthusiastic, the atmosphere was positive, and the criticism received was constructive and fruitful for the ROOT team. We thank the participating physicists and computer scientists: we appreciated your engagement and are looking forward to organising the next ROOT workshop.

Particle physics meets astrophysics and gravity

ICNFP 2018 participants

The 7th International Conference on New Frontiers in Physics (ICNFP 2018) took place on 4–12 July in Kolymbari, Crete, Greece, bringing together about 250 participants.

The opening talk, given by Slava Mukhanov, was dedicated to Stephen Hawking. Among the five special sessions featured, the memorial session for Lev Lipatov – a world-leading figure in the high-energy behaviour of quantum field theory (see CERN Courier January/February 2018 p50) – the session on quantum chromodynamics and the round table on the future of fundamental physics, chaired by Albert de Roeck, drew particularly large audiences.

Alongside the main conference sessions, there were 10 workshops. Among these, the one on heavy neutral leptons highlighted novel mechanisms for producing sterile-neutrino dark matter and prospects for future searches of such dark matter with the next generation of space-based X-ray telescopes, including Spektr-RG, Hitomi and Athena+.

The workshop on instrumentation and methods in high-energy physics focused on the latest developments and the performance of complex detector systems, including triggering, data acquisition and signal-control systems, with an emphasis on large-scale facilities in nuclear physics, particle physics and astrophysics. This programme attracted many participants and led to the exchange of scientific information between different physics communities.

The workshop on new physics paradigms after the Higgs-boson and gravitational-wave discoveries provided an opportunity both to review results from searches for gravitational waves and to show plans for future precision measurements of Standard Model parameters at the LHC.

The workshop also featured several theory talks covering a wide range of subjects, including the implementation of supersymmetry breaking in string theory, new developments in early-universe cosmology and beyond-Standard Model physics. ICNFP 2018 also saw the first workshop on frontiers in gravitation, astrophysics and cosmology, which strengthened the Asian presence at ICNFP, gathering many participants from the Asia Pacific region.

For the second time in the ICNFP series, a workshop on quantum information and quantum foundations took place, with the aim of promoting discussions and collaborations between theorists and experimentalists working on these topics.

Yakir Aharonov gave a keynote lecture on novel conceptual and practical applications of so-called weak values and weak measurements, showing that they lead to many interesting hitherto-unnoticed phenomena. The latter include, for instance, a “separation” of a particle from its physical variables (such as its spin), emergent correlations between remote parties defying fundamental classical concepts, and a completely top-down hierarchical structure in quantum mechanics, which stands in contrast to the concept of reductionism. As exemplified in the talk of Avshalom Elitzur, the latter could be explained using self-cancelling pairs of positive and negative weak values.

Sandu Popescu, Pawel Horodecki, Marek Czachor and Eliahu Cohen presented many new phenomena involving quantum nonlocality in space and time, which open new avenues for extensive research. Ebrahim Karimi discussed various applications of structured quantum waves carrying orbital angular momentum (either photons or massive particles) and also discussed how to manipulate the topology of optical polarisation knots. Onur Hosten emphasised the importance of cold atoms for quantum metrology.

The workshop also featured many excellent talks discussing the intriguing relations between quantum information and condensed-matter physics or quantum optics. Some connections with quantum gravity, based on entanglement, complexity and quantum thermodynamics, were also discussed. Another topic presented was the comparison between the role of spin and polarisation in high-energy physics and quantum optics. In both of these fields, one should consider the total angular momentum, not the spin alone, and helicity is a very helpful concept in both, too.

Future accelerator facilities such as the low-energy heavy-ion accelerator centres FAIR in Darmstadt, Germany, and NICA at the Joint Institute for Nuclear Research in Dubna, Russia, were also discussed, particularly in the workshop on physics at FAIR-NICA-SPS-BES/RHIC accelerator facilities. Here new ideas as well as overview talks on current and future experiments on the formation and exploration of baryon-rich matter in heavy-ion collisions were presented.

The MoEDAL collaboration at CERN, which searches for highly ionising messengers of new physics such as magnetic monopoles, organised a mini-workshop on highly ionising avatars of new physics. The workshop provided a forum for experimentalists and phenomenologists to meet, discuss and expand this discovery frontier. The latest results from the ATLAS, CMS, MoEDAL and IceCube experiments were presented, and some important developments in theory and phenomenology were introduced for the first time. Highlights of the workshop included monopole production via photon fusion at colliders, searches for heavy neutral leptons and other long-lived particles at the LHC, regularised Kalb–Ramond monopoles with finite energy, and monopole detection techniques using solid-state and Timepix detectors.

Finally, on the education and outreach front, Despina Hatzifotiadou gave LHC “masterclasses” in collaboration with EKFE (the laboratory centre for physical sciences) to 30 high-school students and teachers, who had the opportunity to analyse data from the ALICE experiment and “observe” strangeness enhancement in relativistic heavy-ion collisions.

The next ICNFP conference will take place on 21–30 August 2019 in Kolymbari, Crete, Greece.

Hans Paar 1944–2018

Hans Paar

Hans Paar, emeritus professor of physics at the University of California, San Diego (UCSD), passed away on 17 June after a short illness. Paar was initially trained at Delft University of Technology in his native country of the Netherlands. This engineering background served him well throughout his career, allowing him to take on important tasks in the design, construction and testing of equipment in all the particle-physics experiments he participated in.

Paar started his particle-physics career at Columbia University in the US, where he worked with Leon Lederman on one of the first experiments at Fermilab (E70). After completing his PhD thesis on this project, he relocated to Europe to work as a CERN fellow with another Nobel Laureate, Jack Steinberger, on WA1, the first experiment with the high-energy neutrino beam of the newly commissioned Super Proton Synchrotron (SPS).

In 1978, Paar joined a team at NIKHEF, the Dutch National Institute for Subatomic Physics, that worked on the TPC/2γ experiment at the SLAC National Accelerator Laboratory in the US, and he quickly became one of the leaders of the collaboration that carried out this experiment. His visibility at SLAC led to an offer from UCSD, which he joined as a faculty member in 1986 and where he remained for the rest of his career.

Paar was an internationally recognised physicist. From the early 1990s he studied the properties of the bottom quark at electron–positron colliders, first as a member of the CLEO collaboration (Cornell) and later of the BaBar collaboration (SLAC). He also made essential contributions to the design and construction of novel types of calorimeters, in the context of the SPACAL and DREAM projects at CERN.

Later in Paar’s career, his research interests included observational cosmology. Paar and his colleagues set out to detect the B-mode polarisation of the cosmic microwave background radiation to address one of the most fundamental problems in astrophysics – the inflation of the early universe. Paar made crucial contributions to the realisation of this project, named POLARBEAR, which is carried out at high altitude in the Atacama Desert in Chile. Not only did he provide expert leadership, design and analysis skills, he also secured a $600,000 private donation, which helped enable the fabrication of the telescope.

Paar cared deeply about education and about creating a nurturing, motivating environment for students. He was instrumental in modernising UCSD’s curriculum on quantum mechanics at all levels and authored the textbook An Introduction to Advanced Quantum Physics. As part of the UCSD Research Experience for Undergraduates programme, he gave a “Physics of Sailing” course consisting of lectures on the physics of the sport, followed by a full day of sailing on San Diego Bay.

Hans had many interests outside of physics. He was also a devoted husband, (step)father and a very good friend to many. He was a gifted piano player and a serious model-train enthusiast. He helped to create an atmosphere of creative thought and friendliness within every group he was part of. Our thoughts go to his wife Kim, his daughter Suzanne and his stepsons Eric and Alain. We all owe Hans for many happy memories.

Suppression of the Λ(1520) resonance in Pb–Pb collisions

Figure 1

The ALICE collaboration has recently reported the first measurement of the hadronic resonance Λ(1520) in heavy-ion collisions at the LHC. In such collisions, a deconfined plasma of quarks and gluons called quark–gluon plasma (QGP) is formed, which expands and cools. Eventually, the system undergoes a transition to a dense hadron gas (hadronisation), which further expands until all interactions among hadrons cease. Short-lived hadronic resonances are sensitive probes of the dynamics and properties of the medium formed after hadronisation. Due to their short lifetimes, they decay when the system is still dense and the decay products scatter in the hadron gas, reducing the observed number of decays.
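The link between a resonance’s short lifetime and its decay inside the dense medium can be made quantitative through cτ = ħc/Γ. A back-of-the-envelope sketch (the Λ(1520) width used here is the approximate PDG value, quoted as an assumption of this example):

```python
HBARC_MEV_FM = 197.327  # ħc in MeV·fm

def decay_length_fm(width_mev):
    """Mean proper decay length c·tau = ħc / Γ for a resonance of width Γ (MeV)."""
    return HBARC_MEV_FM / width_mev

# Λ(1520) width of about 15.7 MeV (approximate PDG value):
print(f"c·tau ≈ {decay_length_fm(15.7):.1f} fm")  # about 12.6 fm
```

A decay length of order 10 fm is comparable to the extent and lifetime of the hadronic phase, which is why the decay products can re-scatter before interactions cease.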

The production yield of the Λ(1520) baryon resonance was measured at mid-rapidity in lead–lead (Pb–Pb) collisions at a centre-of-mass energy per nucleon–nucleon pair of 2.76 TeV. The resonance is reconstructed in the hadronic decay channel Λ(1520) → pK⁻ (and its charge conjugate), and its production is measured as a function of the collision centrality. The ratio of the number of measured Λ(1520) baryons to that of its stable counterpart, Λ, highlights the characteristics of resonance production directly related to the particle lifetime, since possible effects due to valence-quark composition (e.g. strangeness enhancement) cancel in the ratio. A gradual decrease of the Λ(1520)/Λ yield ratio with increasing charged-particle multiplicity is observed from peripheral to central Pb–Pb collisions (see figure).
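The cancellation of common uncertainties in such a yield ratio follows from standard error propagation. A minimal illustration (the numbers are invented for the example, not the ALICE values):

```python
import math

def ratio_with_error(a, da, b, db, rho=0.0):
    """Ratio a/b with propagated uncertainty; rho is the correlation
    between the relative uncertainties of a and b. A fully correlated
    (rho = 1) common systematic cancels exactly in the ratio."""
    r = a / b
    ra, rb = da / a, db / b
    var = max(ra**2 + rb**2 - 2 * rho * ra * rb, 0.0)  # guard against rounding
    return r, r * math.sqrt(var)

# A common 10% systematic on both yields (illustrative numbers only):
r_unc, dr_unc = ratio_with_error(100.0, 10.0, 400.0, 40.0, rho=0.0)
r_cor, dr_cor = ratio_with_error(100.0, 10.0, 400.0, 40.0, rho=1.0)
print(dr_unc, dr_cor)  # the correlated case propagates to zero uncertainty
```

This is why the Λ(1520)/Λ ratio isolates lifetime-related effects: contributions common to numerator and denominator, such as strangeness enhancement, drop out.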

The result provides the first evidence for Λ(1520) suppression in central heavy-ion collisions compared to peripheral collisions, with a significance of 3.1σ once cancellations of correlated systematic uncertainties are taken into account. An earlier measurement at lower collision energy by the STAR experiment at Brookhaven’s Relativistic Heavy-Ion Collider showed a similar suppression, but with much larger uncertainties. The yield ratio of the Λ(1520) resonance to non-resonant Λ baryons decreases by about 45% in central collisions compared to peripheral collisions.

The EPOS3 model, which describes the full evolution of a heavy-ion collision and includes re-scattering in the hadronic phase, reproduces the trend of this suppression, although it systematically overestimates the data: the relative decrease of the Λ(1520) yield is slightly smaller in EPOS3 than observed, suggesting either that the hadronic phase lives longer than the roughly 8.5 fm/c of EPOS3, or that the description of the relevant hadronic cross sections in the transport phase is imprecise. The mean transverse momentum is also shown to increase with increasing charged-particle multiplicity, hence with increasing collision centrality, a feature that EPOS3 describes quantitatively. Notably, the model fails to describe the data when the microscopic transport stage responsible for the re-scattering effect inside the hadronic medium (as described by the UrQMD model) is disabled.

In summary, these measurements add further support to the formation of a dense hadronic phase in Pb–Pb collisions, highlighting its relevance and the importance of a microscopic description of the latest stages of the evolution of heavy-ion collisions.

Search for new quarks addresses unnaturalness

Figure 1

The Standard Model (SM) is a triumph of modern physics, with unprecedented success in explaining the subatomic world. The Higgs boson, discovered in 2012, was the capstone of this amazing theory, yet this newly known particle raises many questions. For example, interactions between the Higgs boson and the top quark should lead to huge quantum corrections to the Higgs-boson mass, possibly as large as the Planck mass (>10¹⁸ GeV). Why, then, is the observed mass only 125 GeV? Finding a solution to this “hierarchy problem” is one of the top motivations of many new theories of particle physics.
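The size of the problem follows from a standard back-of-the-envelope estimate: the one-loop top-quark contribution to the Higgs-boson mass-squared, cut off at a scale Λ, is approximately

```latex
\delta m_H^2 \;\simeq\; -\frac{3\,y_t^2}{8\pi^2}\,\Lambda^2 ,
```

where y_t ≈ 1 is the top Yukawa coupling. For Λ near the Planck scale, this correction exceeds the observed (125 GeV)² by more than 30 orders of magnitude, so the bare mass would have to be fine-tuned to extraordinary precision.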

A common feature in several of these theories is the existence of vector-like quarks – in particular, a vector-like top quark (T) that could naturally cancel the large quantum corrections caused by the SM top quark. Like other quarks, vector-like quarks are spin-½ particles that interact via the strong force and, like all spin-½ particles, they have left-handed and right-handed versions. The unique feature of vector-like quarks is their ambidexterity: while the weak force only interacts with left-handed SM particles, it would interact the same way with both the right- and left-handed versions of vector-like quarks. This also gives vector-like quarks more options in how they can decay. Unlike the SM top quark, which almost always decays to a bottom quark and a W boson (t → Wb), a vector-like top quark could decay in three different ways: T → Wb, T → Zt or T → Ht.

The search for vector-like quarks in ATLAS spans a wide range of dedicated analyses, each focusing on a particular experimental signature (possibly involving leptons, boosted objects or large missing transverse energy). The breadth of the programme allows ATLAS to be sensitive to most relevant decays of vector-like top quarks, and also those of vector-like bottom quarks, thus increasing the chances of discovery. The creation of particle–antiparticle pairs is the most probable production mechanism for vector-like quarks with mass around or below 1 TeV. For higher masses, single production of vector-like quarks may have a larger rate.

ATLAS recently performed a statistical combination of all the individual searches that looked for pair-production of vector-like quarks. While the individual analyses were designed to be sensitive to particular sets of decays, the combined results provide increased sensitivity to all considered decays of vector-like top quarks with masses up to 1.3 TeV. No vector-like top or bottom quarks were found. The combination allowed ATLAS to set the most stringent exclusion bounds on the mass of a vector-like top quark for arbitrary sets of branching ratios to the three decay modes (figure, left).
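A scan over “arbitrary sets of branching ratios” amounts to sampling the two-dimensional plane on which the three fractions sum to one. A sketch of how such a grid could be built (the step size is arbitrary and this is illustrative code, not the ATLAS machinery):

```python
def branching_ratio_grid(step=0.05):
    """All (BR(T->Wb), BR(T->Zt), BR(T->Ht)) triplets on a regular grid,
    constrained to sum to one -- the plane over which mass limits are set."""
    n = round(1 / step)
    grid = []
    for i in range(n + 1):
        for j in range(n + 1 - i):
            bw, bz = i * step, j * step
            grid.append((bw, bz, 1.0 - bw - bz))
    return grid

points = branching_ratio_grid(0.05)
print(len(points))  # 231 grid points for a 5% step
```

At each such point, the limits from the individual analyses can be combined and the strongest excluded mass recorded, producing a triangular exclusion plane.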

As the limits on vector-like quarks reach higher masses, the importance of searching for their single production rises. Such searches are also interesting from a theoretical perspective, since they allow one to constrain parameters of the production model (figure, right).

Given these new strong limits on vector-like quarks and the lack of evidence for supersymmetry, the theoretical case for a naturally light Higgs boson is not looking good! But nature probably still has a few tricks up her sleeve to get out of this conundrum.

CMS detects first production of top quark and photon

Figure 1

It is well known that the top quark, the heaviest known elementary particle, plays an important role in electroweak-symmetry breaking, and is also one of the most promising particles to be investigated in the search for new physics. Numerous measurements of top-quark interactions have been performed at the Tevatron and LHC since the discovery of this particle at the Tevatron in 1995. The associated production of a top quark with a photon (tγj, where j indicates a jet) via electroweak interactions provides a powerful tool to probe the couplings of the top quark with the photon and the couplings of the W boson with the photon. The small production rate of the tγj process at the LHC makes its observation very challenging. However, any excess observed above the Standard Model (SM) rate would indicate new physics.

The CMS collaboration has released evidence for the tγj process using events with one isolated muon, a photon and jets in the final state. The results are based on proton–proton collision data recorded in 2016 at a centre-of-mass-energy of 13 TeV. The tγj process results in an interesting final state, which requires information from all sub-detectors of the CMS experiment, from the innermost tracker layers to the outermost muon systems. 

The predicted cross section for tγj, including the branching fraction, is 81 fb, which corresponds to a few hundred events in the whole dataset. Therefore, a sophisticated method is needed to separate the signal events from the huge number of background events originating from several other SM processes. In addition, to achieve the highest signal-to-background ratio, a robust multivariate technique is used to estimate the contribution of the background in which a jet is misidentified as a photon. After these methods are employed, the largest background contribution comes from events that contain a top-quark pair associated with a photon.

CMS observed an excess of tγj events over the background-only hypothesis with a significance of 4.4 standard deviations, which corresponds to a p-value of 4.3 × 10⁻⁶. The measured value of the signal cross section in the considered phase space is 115 ± 34 fb. The measurement is in agreement with the SM prediction within one standard deviation. This result is the first experimental evidence of the direct production of a top quark and a photon. Upcoming results, exploiting the full 13 TeV dataset, will further improve the precision of the measurement.
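The quoted correspondence between the p-value and the number of standard deviations is the one-sided Gaussian tail convention used in particle physics, p = ½ erfc(Z/√2). A self-contained sketch using only the standard library (the bisection inversion is illustrative, not the CMS statistical machinery):

```python
import math

def p_value(z):
    """One-sided Gaussian tail probability for a significance of z sigma."""
    return 0.5 * math.erfc(z / math.sqrt(2))

def significance(p, lo=0.0, hi=10.0):
    """Invert p_value by bisection: the z whose tail probability equals p."""
    for _ in range(100):
        mid = 0.5 * (lo + hi)
        if p_value(mid) > p:   # tail still too large -> need larger z
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

print(f"{p_value(5.0):.2e}")               # the 5-sigma discovery threshold
print(f"{significance(4.3e-6):.2f} sigma")
```

A p-value of 4.3 × 10⁻⁶ indeed maps back to roughly 4.4σ, consistent with the quoted evidence-level significance.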
