
Bottomonium elliptic-flow no-show

Diagram of elliptic flow

High-energy heavy-ion collisions at the LHC give rise to a deconfined system of quarks and gluons called the quark–gluon plasma (QGP). One of its most striking features is the emergence of collective motion due to pressure gradients that develop at the centre. Direct experimental evidence for this collective motion is the observation of anisotropic flow, which translates the asymmetry of the initial geometry into a final-state momentum anisotropy. Its magnitude is quantified by harmonic coefficients vn in a Fourier decomposition of the azimuthal distribution of particles. As a result of the almond-shaped geometry of the interaction volume, the largest contribution to the asymmetry is the second coefficient, or “elliptic flow”, v2.
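In the standard convention, the decomposition referred to above is written as:

```latex
\frac{dN}{d\varphi} \propto 1 + 2\sum_{n=1}^{\infty} v_n \cos\!\left[\, n\left(\varphi - \Psi_n\right) \right]
```

where φ is the azimuthal angle of an emitted particle and Ψn is the nth-harmonic symmetry-plane angle; the elliptic-flow coefficient v2 quantifies the cos 2φ modulation induced by the almond-shaped overlap region.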

A positive v2 has been measured for a large variety of particles, from pions, protons and strange hadrons up to the heavier J/ψ meson. The latter is a curious case as quarkonia such as J/ψ are bound states of a heavy quark (charm or bottom) and its antiquark (CERN Courier December 2017 p11). Quarkonia constitute interesting probes of the QGP because heavy-quark pairs are produced early and experience the full evolution of the collision. In heavy-ion collisions at the LHC, charmonia, such as the J/ψ, dissociate due to screening from free colour charges in the QGP, and regenerate by the recombination of thermalised charm quarks. The bottomonium ϒ(1S), being more massive and more tightly bound than charmonium, is expected to dissociate only during the early stage of the collision, when the temperature of the surrounding QGP medium is high. Its regeneration is not expected to be significant because of the small number of available bottom quarks.

The ALICE collaboration recently reported the first measurement of the elliptic flow of the ϒ(1S) meson in lead–lead (Pb–Pb) collisions using the full Pb–Pb data set of LHC Run 2 (figure 1). The measured values of the ϒ(1S) v2 are small and consistent with zero, making bottomonia the first hadrons that do not seem to flow in heavy-ion collisions at the LHC. Compared to the measured v2 of inclusive J/ψ in the same centrality and pT intervals, the v2 of ϒ(1S) is lower by 2.6 standard deviations. The results are also consistent with the small, positive values predicted by models that include no or small regeneration of bottomonia by the recombination of bottom quarks interacting in the QGP.

These observations, in combination with earlier measurements of the suppression of ϒ(1S) and J/ψ, support the scenario in which charmonia dissociate and reform in the QGP, while bottomonia are dominantly dissociated at early stages of the collisions. Future datasets, to be collected during LHC Runs 3 and 4 after a major upgrade of the ALICE detector, will significantly improve the quality of the present measurements.

Grappling with dark energy

Adam Riess of Johns Hopkins University

Could you tell us a few words about the discovery that won you a share of the 2011 Nobel Prize in Physics?

Back in the 1990s, the assumption was that we live in a dense universe governed by baryonic and dark matter, but astronomers could only account for about 30% of that matter. We wanted to measure the expected deceleration of the universe at larger scales, in the hope that we would find evidence for some kind of extra matter that theorists predicted could be out there. So, from 1994 we started a campaign to measure the distances and redshifts of type-Ia supernova explosions. The shift in a supernova’s spectrum due to the expansion of space gives its redshift, and the relation between redshift and distance is used to determine the expansion rate of the universe. By comparing the expansion rates at two different epochs, we can determine how the expansion rate changes over time. We made this comparison in 1998 and, to our surprise, we found that instead of decreasing, the expansion rate was speeding up. A stronger confirmation came after combining our measurements with those of the High-z Supernova Search Team. The only consistent interpretation was that the universe, instead of decelerating, is speeding up its expansion.
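The two relations described here are standard: the redshift follows from the shift of the observed spectrum, and, for nearby objects, Hubble's law ties recession velocity to distance:

```latex
1 + z = \frac{\lambda_{\mathrm{obs}}}{\lambda_{\mathrm{emit}}},
\qquad
cz \approx H_0\, d \quad (z \ll 1)
```

Measuring both z (from the spectrum) and d (from the standardised supernova brightness) at different epochs is what allows the expansion history to be reconstructed.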

What was the reaction from your colleagues when you announced your findings?

That our result was wrong! There were understandably different reactions but the fact that two independent teams were measuring an accelerating expansion rate, plus the independent confirmation from measurements of the Cosmic Microwave Background (CMB), made it clear that the universe is accelerating. We reviewed all possible sources of errors including the presence of some yet unknown astronomical process, but nothing came out. Barring a series of unrelated mistakes, we were looking at a new feature of the universe.

There were other puzzles at that time in cosmology that the idea of an accelerating universe could also solve. The so-called “age crisis” (many stars looked older than the age of the universe) was one of them. This meant that either the stellar ages are too high or that there is something wrong with the age of the universe and its expansion. This discrepancy could be resolved by accounting for an accelerated expansion.

What is driving the accelerated expansion?

One idea is that the cosmological constant, initially introduced by Einstein so that general relativity could accommodate a static universe, is linked to the vacuum energy. Today we know that the vacuum energy can’t be the final answer because summing the contributions from the presumed quantum states in the universe produces a value that is about 120 orders of magnitude higher than observed. Such an expansion rate is so high that it would have ripped apart galaxies, stars and planets before any structure could form.

The accelerating expansion can be due to what we broadly refer to as dark energy, but its source and its physics remain unknown. It is an ongoing area of research. Today we are making further supernova observations to measure the expansion rate even more precisely, which will help us to understand the physics behind it.

By which other methods can we determine the source of the acceleration?

Today there is a vast range of approaches, using both space- and ground-based experiments. A lot of work is ongoing on identifying more supernovae and measuring their distances and redshifts with higher precision. Other experiments are looking at baryon acoustic oscillations, which provide a standard ruler for measuring cosmological distances. There are proposals to use weak gravitational lensing, which is extremely sensitive to the parameters describing dark energy as well as the shape and history of the universe. Redshift-space distortions due to the peculiar velocities of galaxies can also tell us something. We may be able to learn something from these different types of observations in a few years. The hope is to measure the equation of state of dark energy with 1% precision, and its variation over time with about 10% precision. This will offer a better understanding of whether dark energy is the cosmological constant or perhaps some form of energy temporarily stored in a scalar field that could change over time.

Is this one of the topics that you are currently involved with?

Yes, among other things. I am also working on improving the precision of the measurements of the Hubble constant, H0, which characterises the present state and expansion rate of our universe. Refined measurements of H0 could point to potential discrepancies in the cosmological model.

What’s wrong with our current determination of the Hubble constant?

The problem is that even when we account for dark energy (factoring in any uncertainties we are aware of) we get a discrepancy of about 9% when comparing the expansion rate predicted from CMB data using the standard “ΛCDM” cosmological model with the expansion rate measured today. The uncertainty in this measurement has now gone below 2%, corresponding to a significance of more than 5σ, and future observations from the SH0ES programme should reduce it to 1.5%.
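For illustration, the size of the tension can be sketched with representative round numbers (assumed here, not quoted in the interview): a local distance-ladder value near 73 km s⁻¹ Mpc⁻¹ and a CMB-based ΛCDM prediction near 67.4 km s⁻¹ Mpc⁻¹.

```python
import math

# Illustrative values (assumed, representative of published results):
# local distance-ladder measurement vs CMB-based LambdaCDM prediction,
# both in km/s/Mpc.
H0_local, sigma_local = 73.0, 1.0
H0_cmb, sigma_cmb = 67.4, 0.5

diff = H0_local - H0_cmb
frac = diff / H0_cmb                        # fractional discrepancy
sigma = math.hypot(sigma_local, sigma_cmb)  # uncertainties added in quadrature
n_sigma = diff / sigma                      # tension in standard deviations

print(f"discrepancy: {frac:.1%}, tension: {n_sigma:.1f} sigma")
# prints "discrepancy: 8.3%, tension: 5.0 sigma"
```

With these inputs the arithmetic reproduces the ballpark figures in the interview: a high-single-digit percentage discrepancy at a significance around 5σ.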

A new feature in the dark sector of the universe appears increasingly necessary to explain the present tension

There is something more profound in the disagreement of these two measurements. One measures how fast the universe is expanding today, while the other is based on the physics of the early universe – taking into account a specific model – and measuring how fast it should have been expanding. If these values don’t agree, there is a very strong likelihood that we are missing something in our cosmological model that connects the two epochs in the history of our universe. A new feature in the dark sector of the universe appears in my view increasingly necessary to explain the present tension.

When did the seriousness of the H0 discrepancy become clear?

It is hard to pinpoint a date, but it was between the publication of the first results from Planck in 2013, which predicted the value of H0 based on precise CMB measurements, and the publication of our 2016 paper that confirmed the H0 measurement. Since then, the tension has been growing. Various people were convinced along the way as new data came in, while others are still not convinced. This diversity of opinions is a healthy sign for science: we should take into account alternative viewpoints and continuously reassess the evidence that we have without taking anything for granted.

How can the Hubble discrepancy be interpreted?

The standard cosmological model, which contains just six free parameters, allows us to extrapolate the evolution from the Big Bang to the present cosmos, a period of almost 14 billion years. The model is based on certain assumptions: that space in the early universe was flat; that there are three neutrinos; that dark matter is very nonreactive; that dark energy is similar to the cosmological constant; and that there is no more complex physics. So one, or perhaps a combination, of these assumptions could be wrong. Knowing the original content of the universe and the physics, we should be able to calculate how the universe was expanding in the past and what its present expansion rate should be. The fact that there is a discrepancy means that we don’t have the right understanding.

We think that the phenomenon we call inflation is similar to what we call dark energy, and it is possible that there was another expansion episode in the history of the universe just after the recombination period. Certain theories predict that a form of “early dark energy” became significant at that time, giving the universe a boost that matches our current observations. Another option is the presence of dark radiation: a term that could account for a new type of neutrino or for another relativistic particle present in the early history of the universe. The presence of dark radiation would change the estimate of the expansion rate before the recombination period and give us a way to address the current Hubble-constant problem. Future measurements could tell us whether other predictions of these theories are correct.

Does particle physics have a complementary role to play?

Oh definitely. Both collider and astrophysics experiments could potentially reveal either the properties of dark matter or a new relativistic particle or something new that could change the cosmological calculations. There is an overlap concerning the contributions of these fields in understanding the early universe, a lot of cross-talk and blurring of the lines – and in my view, that’s healthy.

What has it been like to win a Nobel prize at the relatively early age of 42?

It has been a great honour. You can choose whether you want to do science or not, as long as this choice is available. So certainly, the Nobel is not a curse. Our team is continually trying to refine the supernova measurements, and the community keeps growing. Hopefully, if you come back in a couple of years, we will have more answers to your questions.

Galaxies thrive on new physics

This supercomputer-generated image of a galaxy suggests that general relativity might not be the only way to explain how gravity works. Theorists at Durham University in the UK simulated the universe using hydrodynamical simulations based on “f(R) gravity” – in which a scalar field enhances gravitational forces in low-density regions (such as the outer parts of a galaxy) but is screened by the so-called chameleon mechanism in high-density environments such as our solar system (see C Arnold et al. Nature Astronomy; arXiv:1907.02977).

The left-hand side of the image shows the scalar field of the theory: bright-yellow regions correspond to large scalar-field values, while dark-blue regions correspond to very small values, i.e. regions where screening is active and the theory behaves like general relativity. The right-hand side of the image shows the gas density with stars overplotted. The study, based on 12 simulations with different model parameters and resolutions and requiring a total runtime of about 2.5 million core-hours, shows that spiral galaxies like our Milky Way could still form even with different laws of gravity.

“Our research definitely does not mean that general relativity is wrong, but it does show that it does not have to be the only way to explain gravity’s role in the evolution of the universe,” says lead author Christian Arnold of Durham University’s Institute for Computational Cosmology.

Interdisciplinary physics with AEDGE

Frequency niche

Following the discovery of gravitational waves by the LIGO and Virgo collaborations, there is great interest in observing other parts of the gravitational-wave spectrum and seeing what they can tell us about astrophysics, particle physics and cosmology. The European Space Agency (ESA) has approved the LISA space experiment that is designed to observe gravitational waves in a lower frequency band than LIGO and Virgo, while the KAGRA experiment in Japan, the INDIGO experiment in India and the proposed Einstein Telescope (ET) will reinforce LIGO and Virgo. However, there is a gap in observational capability in the intermediate-frequency band where there may be signals from the mergers of massive black holes weighing between 100 and 100,000 solar masses, and from a first-order phase transition or cosmic strings in the early universe.

This was the motivation for a workshop held at CERN on 22 and 23 July that brought experts from the cold-atom community together with particle physicists and representatives of the gravitational-wave community. Experiments using cold atoms as clocks and in interferometers offer interesting prospects for detecting some candidates for ultralight dark matter as well as gravitational waves in the mid-frequency gap. In particular, a possible space experiment called AEDGE could complement the observations by LIGO, Virgo, LISA and other approved experiments.

The workshop shared information about long-baseline terrestrial cold-atom experiments that are already funded and under construction, such as MAGIS in the US, MIGA in France and ZAIGA in China, as well as ideas for future terrestrial experiments such as MAGIA-advanced in Italy, AION in the UK and ELGAR in France. Delegates also heard about experiments using cold atoms in space – CACES (China) and CAL (NASA) – and on sounding rockets – MAIUS (Germany).

A suggestion for an atom interferometer using a pair of satellites is being put forward by the AEDGE team

ESA has recently issued a call for white papers for its Voyage 2050 long-term science programme, and a suggestion for an atom interferometer using a pair of satellites is being put forward by the AEDGE team (in parallel with a related suggestion called STE-QUEST) to build upon the experience with prior experiments. AEDGE was the focus of the CERN workshop, and would have unique capabilities to probe the assembly of the supermassive black holes known to power active galactic nuclei, physics beyond the Standard Model in the early universe and ultralight dark matter. AEDGE would be a uniquely interdisciplinary space mission, harnessing cold-atom technologies to address key issues in fundamental physics, astrophysics and cosmology.

Higgs hunters still hungry in Paris

Participants at Higgs Hunting 2019

The 10th Higgs Hunting workshop took place in Orsay and Paris from 29–31 July, attracting 110 physicists for lively discussions about recent results in the Higgs sector. The ATLAS and CMS collaborations presented Run 2 analyses with up to 140 fb–1 of data collected at a centre-of-mass energy of 13 TeV. The statistical uncertainty on some Higgs properties, such as the production cross-section, has now been reduced by a factor three compared to Run 1. This puts some Higgs studies on the verge of being dominated by systematic uncertainties. By the end of the LHC’s programme, measurements of the Higgs couplings to the photon, W, Z, gluon, tau lepton and top and bottom quarks are all expected to be dominated by theoretical rather than statistical or experimental uncertainties.
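The factor-of-three improvement is roughly what simple counting statistics would suggest; a back-of-the-envelope sketch (with assumed, purely illustrative event yields) is:

```python
import math

# Statistical uncertainties scale as 1/sqrt(N_events). If Run 2 yielded
# roughly ten times as many Higgs events as Run 1 (illustrative assumption:
# higher integrated luminosity combined with the larger 13 TeV production
# cross-section), the statistical error shrinks by about a factor of three.
relative_yield_run1 = 1.0   # illustrative, normalised
relative_yield_run2 = 10.0  # illustrative

improvement = math.sqrt(relative_yield_run2 / relative_yield_run1)
print(f"statistical uncertainty reduced by a factor ~{improvement:.1f}")
# prints "statistical uncertainty reduced by a factor ~3.2"
```

This 1/√N scaling is also why further data helps less and less once systematic or theoretical uncertainties dominate, as the article notes for the end of the LHC programme.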

Several searches for additional Higgs bosons were presented. The general recipe here is to postulate a new field in addition to the Standard Model (SM) Higgs doublet, which in the minimal case yields a lone physical Higgs boson, identified with the particle discovered at the LHC in 2012 with a mass of 125 GeV. Adding a hypothetical second Higgs doublet, however, as in the two-Higgs-doublet model, would yield five physical states: the CP-even neutral Higgs bosons h and H, the CP-odd pseudoscalar A, and two charged Higgs bosons H±; the model would also introduce three additional free parameters. Other models discussed at Higgs Hunting 2019 include the minimal and next-to-minimal supersymmetric SMs and models with doubly charged Higgs bosons. Anna Kaczmarska from ATLAS and Suzanne Gascon-Shotkin from CMS described direct searches for such additional Higgs bosons decaying to SM particles or Higgs bosons. Loan Truong from ATLAS and Yuri Gershtein from CMS described studies of rare – and potentially beyond-SM – decays of the 125 GeV Higgs boson. No significant excesses were reported, but hope remains for Run 3, which will begin in 2021.
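The count of five physical states follows from standard degrees-of-freedom bookkeeping: two complex doublets carry eight real fields, three of which are "eaten" as the longitudinal modes of the W± and Z bosons:

```latex
\underbrace{2 \times 4}_{\text{two complex doublets}}
\;-\;
\underbrace{3}_{\text{Goldstones} \,\to\, W^{\pm},\, Z}
\;=\; 5
\quad \Longrightarrow \quad
h,\; H,\; A,\; H^{\pm}
```

The same counting applied to the single SM doublet gives 4 − 3 = 1, the lone Higgs boson of the minimal case.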

Nobel laureate Gerard ’t Hooft gave a historical talk on the role of the Higgs in the renormalisation of electroweak theory, recalling the debt his Utrecht group, where the work was done almost 50 years ago, owed to pioneers like Faddeev and Popov. Seven years after the particle’s discovery, we now know it to be spin-0 with mainly CP-even interactions with bosons, remarked Fabio Cerutti of Berkeley in the experimental summary. With precision on the Higgs mass now better than two parts per mille, all of the SM’s free parameters are known with high precision, he continued, and all but three of them are linked to Higgs-boson interactions.

Give me six hours to chop down a tree and I will spend the first four sharpening the axe.

Abraham Lincoln

Hunting season may now be over, Cerutti concluded, but the time to study Higgs anatomy and exploit the 95% of LHC data still to come is close at hand. Giulia Zanderighi’s theory summary had a similar message: Higgs studies are still in their infancy and the discovery of what seems to be a very SM-like Higgs at 125 GeV allows us to explore a new sector with a broad experimental programme that will extend over decades. She concluded with a quote from Abraham Lincoln: “Give me six hours to chop down a tree and I will spend the first four sharpening the axe.”

The next Higgs Hunting workshop will be held in Orsay and/or Paris from 7–9 September 2020.

PANIC 2020 – The 22nd Particle and Nuclei International Conference

ICHEP 2020 – International Conference on High Energy Physics

Neutrino 2020 – International Conference on Neutrino Physics and Astrophysics

MESON2020 – 16th International Workshop on Meson Physics

The Institute of Physics of Jagiellonian University, Forschungszentrum Jülich, INFN-LNF Frascati and the Institute of Nuclear Physics PAS Cracow are organizing a biennial workshop to establish closer contacts between experimentalists and theorists involved in studies of meson production, properties and interactions. The workshop will feature lectures on both experimental and theoretical aspects, with particular emphasis on the presentation of new results.

The main topics of the workshop are:

  • hadronic and electromagnetic meson production,
  • meson interaction with mesons, baryons, ground state nuclei as well as
    hot and dense nuclear matter,
  • structure of hadrons,
  • precision measurements as tests of fundamental symmetries,
  • exotic systems in QCD,
  • novel approaches in theory and experiment.

The intention is to provide an overview of the present status of these fields, as well as of new developments, and a preview of forthcoming investigations. This workshop – the sixteenth in the series – will maintain the tradition of the workshops organized in Cracow since 1991.

Inflationary Reheating Meets Particle Physics Frontier

This conference will bring together experts in cosmology, particle physics, and fundamental theory to address how and when the universe thermalizes following inflation, and the associated particle physics and dark matter phenomenology. Important topics that will be covered include hidden sector model building in the LHC era, thermalization of the universe following inflation, possibilities of post-inflation cosmic history prior to nucleosynthesis, and associated experimental signatures. The conference aims to attract researchers in different areas to develop new directions in model building and establish new experimental paths for probing early universe cosmology and dark matter phenomenology.
