Gauge–gravity duality opens new horizons

What, in a nutshell, did you uncover in your famous 1997 work, which became the most cited in high-energy physics?

Juan Maldacena

The paper conjectured a relation between certain quantum field theories and gravity theories. The idea was that a strongly coupled quantum system can generate complex quantum states that have an equivalent description in terms of a gravity theory (or a string theory) in a higher-dimensional space. The paper considered special theories with many symmetries, including scale invariance, conformal invariance and supersymmetry, and the fact that those symmetries were present on both sides of the relationship was one of the pieces of evidence for the conjecture. The main argument relating the two descriptions involved objects that appear in string theory called D-branes, which are a type of soliton. Polchinski had previously given a very precise description of the dynamics of D-branes. At low energies a soliton can be described by its centre-of-mass position: if you have N solitons you will have N positions. With D-branes it is the same, except that when they coincide there is a non-Abelian SU(N) gauge symmetry that relates these positions. So this low-energy theory resembles quantum chromodynamics, except with N colours and special matter content.

On the other hand, these D-brane solitons also have a gravitational description, found earlier by Horowitz and Strominger, in which they look like “black branes” – objects similar to black holes but extended along certain spatial directions. The conjecture was simply that these two descriptions should be equivalent. The gravitational description becomes simple when N and the effective coupling are very large.

Did you stumble across the duality, or had you set out to find it?

It was based on previous work on the connection between D-branes and black holes. The first major result in this direction was the computation of Strominger and Vafa, who considered an extremal black hole and compared it to a collection of D-branes. By counting the number of states into which these D-branes can be arranged, they found that it matched the Bekenstein–Hawking black-hole entropy given in terms of the area of the horizon. Such black holes have zero temperature. By slightly exciting these black holes, some of us attempted to extend such results to non-zero temperatures, which allowed us to probe the dynamics of those nearly extremal black holes. Some computations gave similar answers, sometimes exactly, sometimes up to coefficients. It was clear that there was a deep relation between the two descriptions, but it was unclear what the concrete relation was. The gravity–gauge (AdS/CFT) conjecture clarified the relationship.
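The quantitative statement behind this match can be written down (a standard form with constants restored, not a formula from the interview itself): the Bekenstein–Hawking entropy is proportional to the horizon area,

```latex
S_{\mathrm{BH}} \;=\; \frac{k_B\, c^3\, A}{4\, G\, \hbar},
```

and the Strominger–Vafa computation showed that the statistical entropy of the D-brane bound state, $k_B \ln \Omega$ with $\Omega$ the number of microstates, reproduces exactly this area law for the extremal black holes they considered.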

Are you surprised by its lasting impact?

Yes. At the time I thought that it was going to be interesting for people thinking about quantum gravity and black holes. But the applications that people found in other areas of physics continue to surprise me. It is important for understanding quantum aspects of black holes. It was also useful for understanding very strongly coupled quantum theories. Most of our intuition for quantum field theory comes from weakly coupled theories, but interesting new phenomena can arise at strong coupling. These examples of strongly coupled theories can be viewed as useful calculable toy models. The art, which demands a great deal of ingenuity, lies in extracting the right lessons from them and applying them to real-world systems; some of those lessons include possible bounds on transport and a bound on chaos.

What does the gravity–gauge duality tell us about nature, given that it relates two pictures (e.g. involving different dimensionalities of space) that have not yet been shown to correspond to the physical world?

It suggests that the quantum description of spacetime can be in terms of degrees of freedom that are not localised in space. It also says that black holes are consistent with quantum mechanics, when we look at them from the outside. More recently, it was understood that when we try to describe the black-hole interior, we find surprises. What we encounter in the interior of a black hole seems to depend on what the black hole is entangled with. At first this looks inconsistent with quantum mechanics, since we cannot influence a system through entanglement. But it is not. Standard quantum mechanics applies to the black hole as seen from the outside. But to explore the interior you have to jump in, and you cannot tell the outside observer what you encountered inside.

One of the most interesting recent lessons is the important role that entanglement plays in constructing the geometry of spacetime. This is particularly important for the black-hole interior.

I suspect that with the advent of quantum computers, it will become increasingly possible to simulate these complex quantum systems that have some features similar to gravity. This will likely lead to more surprises.

In what sense does AdS/CFT allow us to discuss the interior of a black hole?

It gives us directly a view of a black hole from the outside, more precisely a view of the black hole from very far away. In principle, from this description we should be able to understand what goes on in the interior. While there has been some progress on understanding some aspects of the interior, a full understanding is still lacking. It is important to understand that there are lots of weird possibilities for black-hole interiors. Those we get from gravitational collapse are relatively simple, but there are solutions, such as the full two-sided Schwarzschild solution, where the interior is shared between two black holes that are very far away. The full Schwarzschild solution can therefore be viewed as two entangled black holes in a particular state called the thermofield double, a suggestion made by Werner Israel in the 1970s. The idea is that by entangling two black holes we can create a geometric connection through their interiors: the black holes can be very far away, but the distance through the interior could be very short. However, the geometry is time-dependent and signals cannot go from one side to the other. The geometry inside is like a collapsing wormhole that closes off before a signal can go through. In fact, this is a necessary condition for the interpretation of these geometries as entangled states, since we cannot send signals using entanglement. Susskind and I have emphasised this connection via the “ER=EPR” slogan. This says that EPR correlations (or entanglement) should generally give rise to some sort of “geometric” connection, or Einstein–Rosen bridge, between the two systems. The Einstein–Rosen bridge is the geometric connection between two black holes present in the full Schwarzschild solution.

Are there potential implications of this relationship for intergalactic travel?

Gao, Jafferis and Wall have shown that an interesting new feature appears when one brings two entangled black holes close to each other. Now there can be a direct interaction between the two black holes and the thermofield double state can be close to the ground state of the combined system. In this case, the geometry changes and the wormhole becomes traversable.

One can find solutions of the Standard Model plus gravity that look like two microscopic magnetically charged black holes joined by a wormhole

In fact, as shown by Milekhin, Popov and myself, one can find solutions of the Standard Model plus gravity that look like two microscopic magnetically charged black holes joined by a wormhole. We could construct a controllable solution only for small black holes because we needed to approximate the fermions as being massless.

If one wanted a big macroscopic wormhole through which a human could travel, it would be possible with suitable assumptions about the dark sector. We’d need a dark U(1) gauge field and a very large number of massless fermions charged under it. In that case, a pair of magnetically charged black holes would enable one to travel between distant places. There is one catch: the time it would take to travel, as seen by somebody who stays outside the system, would be longer than the time it takes light to go between the two mouths of the wormhole. This is good, since we expect that causality should be respected. On the other hand, due to the large warping of the spacetime in the wormhole, the time the traveller experiences could be much shorter. So it seems similar to what would be experienced by an observer who accelerates to a very high velocity and then decelerates. Here, however, the force of gravity within the wormhole does the acceleration and deceleration. So, in theory, you can travel with no energy cost.

How does AdS/CFT relate to broader ideas in quantum information theory and holography?

Quantum information has been playing an important role in understanding how holography (or AdS/CFT) works. One important development is a formula, due to Ryu and Takayanagi, for the fine-grained entropy of gravitational systems, such as a black hole. It is well known that the area of the horizon gives the coarse-grained, or thermodynamic, entropy of a black hole. The fine-grained entropy, by contrast, is the actual entropy of the full quantum density matrix describing the system. Surprisingly, this entropy can also be computed in terms of the area of a surface. But it is not the horizon: it is typically a surface that lies in the interior and has minimal area.
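The Ryu–Takayanagi prescription can be sketched as follows (a schematic standard form, with $\gamma_A$ ranging over bulk surfaces homologous to the boundary region $A$ whose entropy is being computed):

```latex
S(A) \;=\; \min_{\gamma_A} \frac{\mathrm{Area}(\gamma_A)}{4\, G_N\, \hbar}
```

When the minimal surface coincides with the horizon, the formula recovers the coarse-grained thermodynamic entropy; for more general states the minimal surface dips into the interior, which is why the fine-grained entropy can differ from the area of the horizon.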

If you could pick any experiment to be funded and built, what would it be?

Well, I would build a higher-energy collider, of, say, 100 TeV, to understand better the nature of the Higgs potential and look for hints of new physics. As for smaller-scale experiments, I am excited about the current prospects to manipulate quantum matter and create highly entangled states that would have some of the properties that black holes are supposed to have, such as being maximally chaotic and allowing the kind of traversable wormholes described earlier.

How close are we to a unified theory of nature’s interactions?

String theory gives us a framework that can describe all the known interactions. It does not give a unique prediction, and the accommodation of a small cosmological constant is possible thanks to the large number of configurations that the internal dimensions can acquire. This whole framework is based on Kaluza–Klein compactifications of 10D string theories. It is possible that a deeper understanding of quantum gravity for cosmological solutions will give rise to a probability measure on this large set of solutions that will allow us to make more concrete predictions.

Three neutrinos and beyond

Jean Tran Thanh Van

Since 1993 the Rencontres du Vietnam have fostered exchanges between scientists in the Asia-Pacific region and colleagues from other parts of the world. The 15th edition, which brought together more than 50 physicists in Quy Nhon, Vietnam, from 4–10 August, celebrated the 30th anniversary of the start of the Large Electron–Positron collider (LEP) in 1989, which within a mere three weeks of running had established that the number of species of light active neutrinos is three (CERN Courier September/October 2019 p32). This was a great opportunity to emphasise the important role that colliders have played, and will continue to play, in neutrino physics. Before the three-neutrino measurement at LEP, the tau neutrino had been established in the years 1975–1986 by a combination of e⁺e⁻ collider observations of tau decays, proton–antiproton collisions (the W → τντ decay was observed at CERN in 1985) and neutrino-beam experiments, where it was observed that taus are never produced by electron- or muon-neutrino beams (for example Fermilab’s E531 experiment in 1986).

A neutrino-oscillation industry then sprang into being, following the discovery that neutrinos have mass. An abundance of recent results on oscillation parameters was presented, from accelerator-neutrino beams, nuclear reactors, and atmospheric and astrophysical neutrinos. Interestingly, the data now seem to indicate at the > 3σ level that neutrinos follow the normal (rather than inverted) mass ordering, in which the most electron-like neutrino has a mass smaller than that of the muon and tau neutrinos. The next 10 years should see this question resolved, as well as a determination of the CP-violating phase of the neutrino-mixing matrix with a precision of 5–10 degrees.

The fact that neutrinos have mass requires an addition to the Standard Model (SM), in which neutrinos are massless by construction. There are several solutions, of which a minimal modification is to introduce right-handed neutrinos in addition to the normal, left-handed ones. The properties of these heavy neutral leptons would be very well predicted were it not for the fact that their mass can lie anywhere from less than an eV to 10¹⁰ GeV or more. Being sterile, they couple to SM particles only via mixing with the normal neutrinos. Consequently, their interactions should be very rare and their lifetimes long, perhaps allowing a spectacular observation either in fixed-target experiments or at high-luminosity colliders at the electroweak scale. One possible low-energy indication could be the existence of neutrinoless double-beta decay. Such decays, currently being searched for directly in dedicated experiments worldwide, violate lepton-number conservation in the case where neutrinos possess a Majorana mass term that transforms neutrinos into antineutrinos.

For a fortunate combination of parameters, this could lead to a spectacular signature

The meeting reviewed the status of all aspects of massive neutrinos, from direct mass measurements of the sort successfully executed shortly after the conference by the KATRIN experiment (see KATRIN sets first limit on neutrino mass) to the search for heavy sterile neutrinos in ATLAS and CMS. A new feature of the field is the abundance of experimental projects searching for very weakly, or, to use the newly coined parlance, “feebly”, interacting particles. These range from CERN’s SHiP experiment to future LHC projects such as FASER and MATHUSLA (for masses from the pion to the B meson); proposed high-luminosity and high-energy colliders such as the Future Circular Collider would extend the search up to the Z mass, for mixings between the heavy and light neutrinos down to 10⁻¹¹. Until recently classified as exotic, these searches could yield the long-sought-after explanation for the matter–antimatter asymmetry of the universe by combining CP violation with an interaction that transforms particles into antiparticles. For a fortunate combination of parameters, this could lead to a spectacular signature: the production of a heavy neutrino in a W decay, tagged by an associated charged lepton, and followed by its transformation into its antineutrino, which could then be identified by its decay into a lepton of the same sign as that initially tagged (and possibly of a different flavour).

The meeting thus concluded in the spirit of its opening commemoration, with a question: could the physics of neutrinos be one of the highlights of future high-energy colliders?

Lepton–photon interactions in Toronto

Lepton–Photon 2019

The 29th International Symposium on Lepton–Photon Interactions at High Energies was held in Canada from 5–10 August at the Westin Harbour Castle hotel, right on the Lake Ontario waterfront in downtown Toronto. Almost 300 delegates were offered a snapshot of the entire field of particle physics and, for the first time, parallel sessions were convened from abstracts submitted by collaborations and individuals.

The symposium opened with a welcome from Chief Laforme of the Mississauga First Nation. It was followed by highlights from the LHC experiments and updates on plans for the CERN accelerator complex, the CEPC project in China and the recently inaugurated Belle II programme in Japan. The Belle II collaboration showed early results from their first 6.5 fb⁻¹ of SuperKEKB data, including measurements of previously studied Standard Model (SM) phenomena and a new limit on dark-photon production near 10 GeV. Further plenary sessions covered dark-matter searches, multi-messenger astronomy, Higgs, electroweak and top-quark physics, heavy-ion physics, QCD, exotic-particle searches, flavour physics and neutrino physics.

Tatsuya Nakada offered his views on flavour factories

The symposium ended with a progress report on the European strategy for particle physics and summaries of advances in particle detection and instrumentation, followed by a presentation on outreach and education initiatives from Kétévi Assamagan (Witwatersrand and BNL), and perspectives on future facilities. In the discussion on future flavour facilities, Tatsuya Nakada (EPFL) offered his views on flavour factories, emphasising their important role in guiding future experiments. He stressed that yesterday’s discoveries (most recently the Higgs boson) become today’s workhorses, providing stringent tests of the SM. In the coming decades we are likely to have W and Higgs factories that will further illuminate the remaining shadows in the SM.

A packed public lecture by 2015 Nobel-Prize winner Art McDonald demonstrated the keen interest of the broader public in the continued developments in particle physics, including those in Canada at the SNOLAB underground laboratory, which now hosts several experiments engaged in neutrino physics and dark-matter searches, following the seminal results from the SNO experiment.

TAUP tackles topical questions

Guido Drexlin

The 16th International Conference on Topics in Astroparticle and Underground Physics (TAUP 2019) was held in Japan from 9–13 September, attracting a record 540 physicists from around 30 countries. The 2019 edition of the series, which covered recent experimental and theoretical developments in astroparticle physics, was hosted by the Institute for Cosmic Ray Research of the University of Tokyo, and held in Toyama – the gateway city to the Kamioka experimental site.

Discussions first focused on gravitational-wave observations. During their first two observing runs, reported Patricia Schmidt from Radboud University, LIGO and Virgo confidently detected gravitational waves from 10 binary black-hole coalescences and one binary neutron-star inspiral – one gravitational-wave event every 15 days of observation. It was also reported that, during the ongoing third observing run, LIGO and Virgo have already observed 26 candidate events. Among them is the first signal from a black hole–neutron star merger.

Guido Drexlin revealed the first measurement results on the upper limit of the neutrino mass

The programme continued with presentations from various research fields, a highlight being a report on the first result of the KATRIN experiment (KATRIN sets first limit on neutrino mass). Co-spokesperson Guido Drexlin revealed the first measured upper limit on the neutrino mass: < 1.1 eV at 90% confidence. This world-leading direct limit – obtained by precisely measuring the kinematics of the electrons emitted in tritium beta decays – was based on only four weeks of data. As the experiment continues, it is expected that the limit will be pushed down further, or even – if the neutrino mass is sufficiently large – that the actual mass will be determined. Since the discovery of neutrino oscillations in 1998, it has been known that neutrinos have tiny, but non-zero, masses; however, their absolute values have not yet been measured.

Diversity is a key feature of the TAUP conference. Topics discussed included cosmology, dark matter, neutrinos, underground laboratories, new technologies, gravitational waves, high-energy astrophysics and cosmic rays. Multi-messenger astronomy – which combines information from gravitational-wave observation, optical astronomy, neutrino detection and other electromagnetic signals – is quickly becoming established and is expected to play an even more important role in the future in gaining a deeper understanding of the universe.

The next TAUP conference will be held in Valencia, Spain, from 30 August to 3 September 2021.

Odessa conference surveys new trends

Maciej Trzebinski

The 2019 edition of New Trends in High Energy Physics took place in Odessa, Ukraine, from 12 to 18 May, with 84 participants attending from 21 countries. Initiated by the Bogolyubov Institute for Theoretical Physics of the National Academy of Sciences of Ukraine and the Joint Institute for Nuclear Research (JINR) in Dubna, the series focuses on new ideas and hot problems in theory and experiment. The series started in 1992 in Kiev under the name HADRONS, changed its title to “New Trends in High-Energy Physics” at the turn of the millennium, took place for a decade in Crimea, then moved to Natal (Brazil) and Becici (Montenegro), before coming back to Ukraine this year.

This year’s conference had an emphasis on heavy-ion physics and strong interactions, with aspects of the QCD phase diagram such as signatures of the transition from quark–gluon plasma to hadrons highlighted in several talks. The interpretation of recent experimental results on collectivity (the bulk motion of nuclear matter at high temperatures) in terms of the formation of a “perfect liquid” was also discussed. Future searches for glueballs and other exotic hadronic states will contribute to an improved understanding of non-perturbative aspects of QCD.

Many problems of low and intermediate energy physics are still unresolved

Parallel to the quest for the highest possible energies, many problems of low- and intermediate-energy physics are still unresolved, such as the critical behaviour of excited baryonic matter, the nature of exotic resonances and puzzles relating to spin. The construction of new facilities will help answer these questions, with high-luminosity collisions of particles ranging from polarised protons to gold ions at JINR–Dubna’s NICA facility, complemented by fixed-target antiproton and ion studies with unprecedented collision rates at FAIR, the new international accelerator complex at GSI Darmstadt.

Talks on general relativity and cosmology, dark matter and black holes explored the many facets of modern astrophysical observations. Future multi-messenger observations, combining the measurements of the electromagnetic radiation spectrum and neutrinos with gravitational wave signals, are expected to contribute significantly to an improved understanding of the dynamics of binary black-hole and neutron-star mergers. Such measurements are of great significance for a variety of open issues, for example, nuclear physics at densities far beyond the regime accessible in laboratory experiments.

The next edition of the conference will be held in Kiev from 27 June to 3 July 2021.

Λ-hyperon anomaly confirmed

CLAS detector

A team of researchers from the UK, Germany and the US has used data from the CLAS experiment at Jefferson Laboratory to confirm an anomalous measurement of α – a key parameter in the theoretical description of the non-leptonic decays of Λ hyperons. α describes the interference of parity-conserving and parity-violating amplitudes in the matrix element of the decay Λ → pπ⁻, and its Particle Data Group listing had remained unchanged for over 40 years. The new value will have consequences for heavy-ion physics, measurements of the transverse polarisation of Λ hyperons, the decays of heavier strange baryons, and kaon production.

Prior to this year, the best measurement of α was derived from π⁻p → ΛK⁰ interactions using liquid-hydrogen targets, dating from the early 1970s. In May, however, the BESIII collaboration in Beijing published a new measurement, α = 0.750 ± 0.009 (stat) ± 0.004 (syst), based on observations of the decays of ΛΛ̄ pairs from electron–positron collisions at the J/ψ resonance. The collaboration also reported a measurement of the corresponding parameter α+ = −0.758 ± 0.010 (stat) ± 0.007 (syst) for the charge-conjugate decay Λ̄ → p̄π⁺, consistent with the conservation of CP symmetry – the most sensitive such test with Λ baryons so far. The BESIII value is 17% (corresponding to more than five standard deviations) above the previously accepted value, α = 0.642 ± 0.013. “We suspect previous experiments underestimated some systematic biases in their analyses,” says BESIII spokesperson Yuan Changzheng, of the Institute for High-Energy Physics in Beijing.
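As a sanity check, the quoted 17% shift and its significance can be reproduced from the numbers above (a rough estimate that adds statistical and systematic uncertainties in quadrature and treats the two values as independent; variable names are ours):

```python
import math

# Values quoted in the text
alpha_besiii, stat, syst = 0.750, 0.009, 0.004
alpha_old, err_old = 0.642, 0.013

err_besiii = math.hypot(stat, syst)  # stat and syst added in quadrature
rel_shift = (alpha_besiii - alpha_old) / alpha_old
sigma = (alpha_besiii - alpha_old) / math.hypot(err_besiii, err_old)

print(f"shift = {rel_shift:.1%}, significance = {sigma:.1f} sigma")
# prints: shift = 16.8%, significance = 6.6 sigma
```

The shift rounds to the 17% quoted in the text, and the significance comfortably exceeds the five standard deviations mentioned.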

David Ireland of the University of Glasgow, and colleagues at George Washington University, the University of Bonn and Forschungszentrum Jülich, have now confirmed the BESIII measurement using kaon photoproduction (γp → K⁺Λ) data from the CLAS detector, which operated from 1998 to 2012. Their analysis exploited CLAS measurements of polarisation observables that describe the decay of the recoiling Λ → pπ⁻ to infer the value of α using a theoretical tool known as the Fierz identities. The value found, α = 0.721 ± 0.006 (stat) ± 0.005 (syst), is close to, but noticeably below, the BESIII value.

Any experiment that has used this value as part of their analysis should look again

David Ireland

“These data were not measured specifically to evaluate α,” said Ireland, a former spokesperson of CLAS, “but when the BES result was reported, we realised that they represented a unique opportunity to make an independent estimate of the decay parameter.” Any experiment that has used this value as part of its analysis should look again, he continued. “It would be sensible for the time being to use both the BES and CLAS results to give a range of possible systematic uncertainty.”

The new analysis confirms the BESIII result “very nicely”, concurs Yuan, given that it is based on completely different data and a different technique. “BESIII has now accumulated another 8.7 billion J/ψ events, and the same process will be analysed to further improve the precision, both statistical and systematic.”

Exotic hadrons take centre stage in Guilin

The 18th International Conference on Hadron Spectroscopy and Structure, HADRON2019, took place in Guilin, China, from 16 to 21 August, co-hosted by the Guangxi Normal University and the Institute of Theoretical Physics of the Chinese Academy of Sciences. The conference brought together more than 330 experimental and theoretical physicists from more than 20 countries to discuss topics ranging from meson and baryon spectroscopy to nucleon structure and hypernuclei. The central issue was exotic hadrons: the strongly interacting particles that deviate from the textbook definitions of mesons and baryons. Searches for exotic hadrons and studies of their properties have been a focus for many high-energy physics experiments, and many fascinating results have been reported since 2003, when the first particles of this sort were discovered: the hidden-charm X(3872) and the open-charm Ds0*(2317), observed by Belle and BaBar, respectively. The most cited physics papers of Belle and BESIII, and the second most cited of BaBar and LHCb, are reports of the discoveries of exotic hadron candidates.

The conference began with a report on LHCb measurements of the doubly charmed Ξcc⁺⁺ baryon, and the discovery of pentaquark particles called Pc. The higher statistics of the LHC Run-2 data have resolved the Pc(4450) reported by LHCb in 2015 into two narrower structures, Pc(4440) and Pc(4457). In addition, a third hidden-charm pentaquark, Pc(4312), with a smaller mass, was observed for the first time. These Pc structures are very likely exotic baryons consisting of at least five quarks, including a charm quark–antiquark pair. Many theorists believe that these pentaquarks can be described as hadronic molecules of a charmed meson and a charmed baryon, analogous to the deuteron, which is a bound state of a neutron and a proton. A series of parallel talks described theoretical predictions that will be useful in motivating further measurements, such as searches for the decay to a charmed baryon and a charmed meson, and searches for the various new pentaquarks predicted by theoretical models.

The X(3872) discovered by Belle 16 years ago is still the subject of intensive investigations

Illustrating the difficulty of understanding the inner structure of hadrons, the X(3872) discovered by Belle 16 years ago is still the subject of intensive investigation. Its mass is extremely close to the sum of the masses of the two charmed mesons D⁰ and D*⁰, and its decay width (< 1.2 MeV) is anomalously small for a hadron of such a mass. New results on its decays into lighter particles were reported by BESIII. With proposals for precise measurements of its mass, width and polarisation at Belle II, PANDA and the LHC experiments, a deeper understanding of the X(3872) may be just around the corner. A close collaboration between experimentalists and theorists is required, and this conference provided a valuable opportunity to exchange ideas. Interesting discussions will continue at the next HADRON conference, to be held in Mexico in 2021.

Last stop for the Higgs Couplings workshop

Higgs-boson measurements are entering the precision regime, with Higgs couplings to gauge bosons now measured to better than 10% precision, and its decays to third-generation fermions measured to better than 20%. These and other recent experimental and theoretical results were the focus of discussions at the eighth international Higgs Couplings workshop, held in Oxford from 30 September to 4 October 2019. Making its final appearance under this moniker (next year it will be rebranded as Higgs 2020), the workshop comprised 38 plenary and 46 parallel talks attended by 120 participants.

The first two days of the conference reviewed Higgs measurements, including a new ATLAS measurement of ttH production using Higgs-boson decays to leptons, and a differential measurement of Higgs-boson production in its decays to W-boson pairs using all of the CMS data from Run 2. These measurements showed continuing progress in coupling measurements, but the highlight of the precision presentations was a new determination of the Higgs-boson mass from CMS using its decays to two photons. Combining this result with previous CMS measurements gives a Higgs-boson mass of 125.35 ± 0.15 GeV/c², corresponding to an impressive relative precision of 0.12%. From the theory side, the challenges of keeping up with experimental precision were discussed. For example, the Higgs-boson production cross section is calculated to the highest order of any observable in perturbative QCD, and yet it must be predicted even more precisely to match the expected experimental precision of the HL-LHC.
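The quoted relative precision follows directly from the combined value (a quick arithmetic check on the numbers above; variable names are ours):

```python
# CMS combined Higgs-boson mass and its uncertainty, in GeV
m_h, delta_m = 125.35, 0.15

# Relative precision quoted in the text
rel_precision = delta_m / m_h
print(f"relative precision = {rel_precision:.2%}")
# prints: relative precision = 0.12%
```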

ATLAS presented an updated self-coupling constraint

One of the highest-priority targets of the HL-LHC is the measurement of the self-coupling of the Higgs boson, which is expected to be determined to 50% precision. This determination is based on double-Higgs production, to which the self-coupling contributes when a virtual Higgs boson splits into two Higgs bosons. ATLAS and CMS have performed extensive searches for double-Higgs production using data from 2016, and at the conference ATLAS presented an updated self-coupling constraint using a combination of single- and double-Higgs measurements and searches. Allowing only the self-coupling to be modified, by a factor κλ, in the loop corrections yields a constraint on the Higgs self-coupling of –2.3 < κλ < 10.3 times the Standard Model prediction at 95% confidence.

The theoretical programme of the conference included an overview of the broader context for Higgs physics, covering the possibility of generating the observed matter–antimatter asymmetry through a first-order electroweak phase transition, as well as possibilities for generating the Yukawa coupling matrices. In the so-called electroweak-baryogenesis scenario, the cooling universe developed bubbles of broken electroweak symmetry with asymmetric matter–antimatter interactions at their boundaries, with sphalerons in the electroweak-symmetric space converting the resulting matter asymmetry into a baryon asymmetry. The matter-asymmetric interactions could have arisen through Higgs-boson couplings to fermions or gauge bosons, or through its self-couplings. In the latter case the source could be an additional electroweak singlet or doublet modifying the Higgs potential.

The broader interpretation of Higgs boson measurements and searches was discussed both in the case of specific models and in the Standard Model effective field theory, where new particles appear at significantly higher masses (~1 TeV/c2 or more). Calculations in the effective field theory continue to advance, adding higher orders in QCD to more electroweak processes and providing an analytical determination of the dependence of the Higgs decay width on the theory parameters. Constraints on the number and values of these parameters also continue to improve through an expanded use of input measurements.

The conference wrapped up with a look into the crystal ball of future detectors and colliders, with a sobering yet inspirational account of detector requirements at the next generation of colliders. To solve the daunting challenges, the audience was encouraged to be creative and explore new technologies, which will likely be needed to succeed. Various collider scenarios were also presented in the context of the European Strategy update, which will wrap up early next year.

The newly minted Higgs conference will next be held in late October or early November of 2020 in Stony Brook, New York.

Redeeming the role of mathematics

A currently popular sentiment in some quarters is that theoretical physics has dived too deeply into mathematics, and lost contact with the real world. Perhaps, it is surmised, the edifice of quantum gravity and string theory is in fact a contrived Rube Goldberg machine, or a house of cards that is about to collapse – especially given that one of its supporting pillars, namely supersymmetry, has not been discovered at the LHC. Graham Farmelo’s new book sheds light on this issue.

The universe speaks in numbers, reads Farmelo’s title. With hindsight this allows a double interpretation: first, that it is primarily mathematical structure that underlies nature; second, as a caution that the universe speaks to us purely via measured numbers, and that theorists should pay attention to that. The majority of physicists would likely support both interpretations, and agree that there is no real tension between them.

The author, who was a theoretical physicist before becoming an award-winning science writer, does not embark on a detailed scientific discussion of these matters, but provides a historical tour de force of the relationship between mathematics and physics, and their tightly correlated evolution. In the time of the ancient Greeks there was no distinction between these fields, and it was only from about the 19th century onwards that they were viewed as separate. Evidently, a major factor was the growing role of experiments, which provided a firmer grounding in the physical world than what had previously been called natural philosophy.

The book follows the mutual fertilisation of mathematics and physics through the last few centuries, as the disciplines gained momentum with Newton, and exploded in the 20th century. Along the way it peeks into the thinking of notable mathematicians and physicists, often with strong opinions. For example, Dirac, a favourite of the author, is quoted as reflecting both that “Einstein failed because his mathematical basis… was not broad enough” and that “theoretical physicists should not allow themselves to be distracted by every surprising experimental finding.” The belief that mathematical structure is at the heart of physics and that experimental results ought to have secondary importance holds sway in this section of the book. Such thinking is perhaps the result of selection bias, however, as only scientists with successful theories are remembered.

The detailed exposition makes the reader vividly aware that the relationship between mathematics and physics is a roller-coaster loaded with mutual admiration, contempt, misunderstandings, split-ups and re-marriages. Which brings us, towards the end of the book, to the current state of affairs in theoretical high-energy physics, which most of us in the profession would agree is characterised by extreme mathematical and intellectual sophistication, paired with a stunning lack of experimental support. After many decades of flourishing interplay, which provided, for example, the group-theoretical underpinning of the quark model, the geometry of gauge theories, the algebraic geometry of supersymmetric theories and finally strings, is there a new divorce ahead? It appears that some not only desire, but relish the lack of supporting experimental evidence. This concern is also expressed by the author, who criticises self-declared experts who “write with a confidence that belies the evident slightness of their understanding of the subject they are attacking”.

The last part of the book is the least readable. Based on personal interactions with physicists, the exposition becomes too detailed to be of use to the casual or lay reader. While there is nothing wrong with the content, which is exciting, it will only be meaningful to people who are already familiar with the subject. On the positive side, however, it gives a lively and accurate snapshot of today’s sociology in theoretical particle physics, and of influential but less well known characters in the field.

The Universe Speaks in Numbers illuminates the role of mathematics in physics in an easy-to-grasp way, exhibiting in detail their interactive co-evolution until today. A worthwhile read for anybody, the book is best suited for particle physicists who are close to the field.

KATRIN sets first limit on neutrino mass

Based on just four weeks of running, researchers at the Karlsruhe Tritium Neutrino (KATRIN) experiment in Germany have set a new model-independent bound on the mass of the neutrino. At a colloquium today, the collaboration reported an upper limit of 1.1 eV at 90% confidence, almost halving the previous bound.

Neutrinos are among the least well understood particles in the Standard Model. Their three known mass eigenstates do not match up with the better-known flavour eigenstates, but mix according to the PMNS matrix, resulting in the flavour transmutations seen by neutrino-oscillation experiments. Despite their success in constraining neutrino mixing, such experiments are sensitive only to squared mass differences between the eigenstates, and not to the neutrino masses themselves.
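The statement that oscillation experiments probe only squared-mass differences can be illustrated with the standard two-flavour oscillation formula. This is a toy sketch: the mixing angle and Δm² values below are illustrative numbers, not experimental fits.

```python
import math

def appearance_prob(theta, m1_sq, m2_sq, l_over_e):
    """Standard two-flavour appearance probability.
    theta: mixing angle in radians; m1_sq, m2_sq: masses squared in eV^2;
    l_over_e: baseline over energy in km/GeV. The factor 1.27 absorbs
    the unit conversions in the usual convention."""
    dm2 = m2_sq - m1_sq
    return math.sin(2 * theta) ** 2 * math.sin(1.27 * dm2 * l_over_e) ** 2

# Shifting both masses squared by the same amount leaves Δm², and hence
# the oscillation probability, unchanged: only the difference is observable.
p1 = appearance_prob(0.6, 0.0, 2.5e-3, 500.0)
p2 = appearance_prob(0.6, 1.0, 1.0 + 2.5e-3, 500.0)
print(abs(p1 - p2) < 1e-9)  # prints True
```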

Physicists have pursued direct mass measurements since Reines and Cowan observed electron antineutrinos in inverse beta decays in 1956. The direct mass measurement method hinges on precisely measuring the energy spectrum of beta-decay electrons, and is considered model independent as the extracted neutrino mass depends only on the kinematics of the decay. KATRIN is now the most precise experiment of this kind. It builds on the invention of gaseous molecular tritium sources and spectrometers based on the principle of magnetic adiabatic collimation with electrostatic filtering. The combination of these methods culminated in the previous best limits of 2.3 eV at 95% confidence in 2005, and 2.05 eV at 95% confidence in 2011, by physicists working in Mainz, Germany and Troitsk, Russia, respectively. The KATRIN analysis improves on these experimental results, with systematic uncertainties reduced by a factor of six and statistical uncertainties reduced by a factor of two.
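The kinematic origin of this sensitivity can be made explicit. Near the endpoint energy $E_0$, the differential beta spectrum takes the standard textbook form (not quoted in the article, and written here with constant factors suppressed):

```latex
\frac{\mathrm{d}\Gamma}{\mathrm{d}E} \;\propto\; (E_0 - E)\,\sqrt{(E_0 - E)^2 - m_{\bar{\nu}_e}^2}\;\Theta\!\left(E_0 - E - m_{\bar{\nu}_e}\right),
```

where $m_{\bar{\nu}_e}^2 = \sum_i |U_{ei}|^2 m_i^2$ is the effective electron-antineutrino mass squared. A non-zero mass both lowers the endpoint and distorts the spectral shape in the last few electronvolts below $E_0$, which is the distortion such experiments measure.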

“These are exciting times for the collaboration,” said KATRIN co-spokesperson Guido Drexlin. “The first KATRIN result is based on a measurement campaign of only four weeks at reduced source activity, equivalent to five days at nominal activity.” To reach its final sensitivity, KATRIN will collect data for 1000 days, and systematic errors will be reduced. “This will allow us to probe neutrino masses down to 0.2 eV,” continued Drexlin, “as well as many other interesting searches for beyond-the-Standard-Model physics, such as for admixtures of sterile neutrinos from the eV up to the keV scale.”

The KATRIN beamline

Conceived almost two decades ago, KATRIN operates using a high-resolution, large-acceptance and low-background measurement of the spectrum of tritium beta decay, ³H → ³He + e⁻ + ν̄e. Electrons are transported to the spectrometer via a beamline that was completed in autumn 2016, allowing experimenters to search for distortions in the tail of the electron energy distribution that depend on the absolute mass of the neutrino. KATRIN collaborators are now looking forward to a two-month measurement campaign, which will start in a few days. Its signal-to-background ratio is expected to be about one order of magnitude better than in the initial measurements, thanks to an increase in source activity and a decrease in background from hardware upgrades. The goal is to achieve an activity of 10¹¹ beta-decay electrons per second while reducing the current background level by about a factor of two.
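The quoted order-of-magnitude gain in signal-to-background follows from simple scaling. The activity factor below is an assumption chosen to be consistent with the quoted numbers, not a figure from KATRIN:

```python
# Hypothetical scaling of the signal-to-background ratio.
activity_gain = 5.0    # assumed signal increase from higher source activity
background_cut = 2.0   # quoted factor-of-two reduction in background level

# S/B scales with the signal gain times the background reduction.
sb_improvement = activity_gain * background_cut
print(sb_improvement)  # prints 10.0, i.e. roughly one order of magnitude
```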

Direct measurements are not the only handle on neutrino masses available to physicists, though they are certainly the most model independent. Experiments searching for neutrinoless double beta-decay offer a complementary limit, but must assume that the neutrino is a Majorana fermion.

The tightest limit on neutrino masses comes from cosmology. Comparing data from the Planck satellite with simulations of the development of structure in the early universe yields an upper limit on the sum of all three neutrino masses of 0.17 eV at 95% confidence.
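For a rough comparison of the cosmological bound with the direct limit, one can assume nearly degenerate masses. This is a simplification, since the measured Δm² splittings are tiny on this scale, but it makes the arithmetic transparent:

```python
# Rough per-eigenstate bound implied by the Planck limit on the sum,
# assuming all three masses are nearly equal (a simplifying assumption).
planck_sum = 0.17               # eV, 95% CL on the sum of the three masses
per_state = planck_sum / 3
print(f"{per_state:.3f} eV")    # prints 0.057 eV per eigenstate

katrin_limit = 1.1              # eV, 90% CL direct limit
ratio = katrin_limit / per_state
print(round(ratio))             # direct limit is ~19x weaker, but model independent
```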

“The Planck limit is fairly robust, and one would have to go to great lengths to avoid it – but it’s not impossible to do so,” says CERN theorist Joachim Kopp. For example, it would be invalidated by a scenario where as-yet-undiscovered right-handed neutrinos couple to a new scalar field with a vacuum expectation value that evolves over cosmological timescales. “Planck data tell us what neutrinos were like in the early universe,” says Kopp. “The value of KATRIN lies in testing neutrinos now.”
