Topics

European computing cloud takes off

A European scheme to make publicly funded scientific data openly available has entered its first phase of development, with CERN one of several organisations poised to test the new technology. Launched in January and led by the UK’s Science and Technology Facilities Council, a €10 million two-year pilot project funded by the European Commission marks the first step towards the ambitious European Open Science Cloud (EOSC) project. With more than 30 organisations involved, the aim of the EOSC is to establish a Europe-wide data environment to allow scientists across the continent to exchange and analyse data. As well as providing the basis for better scientific research and making more efficient use of data resources, the open-data ethos promises to address societal challenges such as public-health or environmental emergencies, where easy access to reliable research data may improve response times.

The pilot phase of the EOSC aims to establish a governance framework and build the trust and skills required. Specifically, the pilot will encourage selected communities to develop demonstrators to showcase EOSC’s potential across various research areas including life sciences, energy, climate science, materials science and the humanities. Given the intense computing requirements of high-energy physics, CERN is playing an important role in the pilot project.

The CERN demonstrator aims to show that the basic requirements for the capture and long-term preservation of particle-physics data, documentation, software and the environment in which it runs can be satisfied by the EOSC pilot. “The purpose of CERN’s involvement in the pilot is not to demonstrate that the EOSC can handle the complex and demanding requirements of LHC data-taking, reconstruction, distribution, re-processing and analysis,” explains Jamie Shiers of CERN’s IT department. “The motivation for long-term data preservation is for reuse and sharing.”

Propelled by the growing IT needs of the LHC and by experience gained in deploying scientific workloads on commercial cloud services, CERN proposed a model for a European science cloud some years ago, explains Bob Jones of CERN’s IT department. In 2015 this model was expanded and endorsed by members of EIROforum. “The rapid expansion in the quantities of open data being produced by science is stretching the underlying IT services,” says Jones. “The Helix Nebula Science Cloud, led by CERN, is already working with leading commercial cloud-service providers to support this growing need for a wide range of scientific use cases.”

The challenging EOSC project, which raises issues such as service integration, intellectual property, legal responsibility and service quality, complements the work of the Research Data Alliance and builds on the European Strategy Forum on Research Infrastructures (ESFRI) road map. “Our goal is to make science more efficient and productive and let millions of researchers share and analyse research data in a trusted environment across technologies, disciplines and borders,” says Carlos Moedas, EC commissioner for research, science and innovation.

Milestone for US dark-matter detector

The US Department of Energy (DOE) has formally approved a key construction milestone for the LUX-ZEPLIN (LZ) experiment, propelling the project towards its April 2020 goal for completion. On 9 February the project passed a DOE review and approval stage known as “Critical Decision 3”, which accepts the final design and formally launches construction. The LZ detector, which will be built roughly 1.5 km underground at the Sanford Underground Research Facility in South Dakota and be filled with 10 tonnes of liquid xenon to detect dark-matter interactions, is considered one of the best bets to determine whether dark-matter candidates known as WIMPs exist.

The project stems from the merger of two previous experiments: LUX (Large Underground Xenon) and ZEPLIN (ZonEd Proportional scintillation in LIquid Noble gases). It was first approved in 2014 and currently has about 250 participating scientists in 37 institutions in the US, UK, Portugal, Russia and Korea. The detector is expected to be at least 50 times more sensitive to finding signals from dark-matter particles than its predecessor LUX, and will compete with other liquid-xenon experiments under development worldwide in the race to detect dark matter. A planned upgrade to the current XENON1T experiment (called XENONnT) at Gran Sasso National Laboratory in Italy and China’s plans to advance the PandaX-II detector, for instance, are both expected to have a similar schedule and scale to LZ.

The LZ collaboration plans to release a Technical Design Report later this year. “We will try to go as fast as we can to have everything completed by April 2020,” says LZ project director Murdock Gilchriese. “We got a very strong endorsement to go fast and to be first.”

European organisations uphold scientific values

More than 50 science organisations in Europe have written an open letter expressing concern about the impact of recent US policies on science, research and innovation. The 10 February letter, which was organised by EuroScience (founder of the EuroScience Open Forum, ESOF), asks that the principles and values that underpin scientific progress be upheld. It is addressed to the presidents of the European Council and European Commission, and to prime ministers and science ministers in individual European countries.

The European Physical Society (EPS) is among the many signatories of the letter, as are the Marie Curie Alumni Association, the Royal Society and the Royal Swedish Academy of Sciences. Explaining the decision to sign, outgoing EPS president Christophe Rossel says: “Science never was and never will be restrained by physical, cultural and political barriers. In our globalised world, where international scientific collaboration has become the rule, there is no place for discrimination and censorship. Any measure that restricts the freedom of movement and communication of our US colleagues will have a profound impact on science and innovation in Europe and other continents.”

Three chief concerns are outlined in the letter: the recent Executive Order discriminating against persons on the basis of their nationality; indications that US government scientists might be affected by new policies that limit their communication with the press; and the unwarranted credibility given to views that are not based on facts and sound evidence in areas such as climate science. It states that all of these are at odds with the principles of transparency, open communication and the mobility of scholars and scientists, “which are vital to scientific progress and to the benefit of our societies, economies and cultures deriving from it”.

Chamonix event prepares for LHC’s future

2016 was a remarkably successful year for CERN’s Large Hadron Collider (LHC), marked by excellent peak performance, good availability and operational flexibility (CERN Courier December 2016 p5). Targeting further improvement, a thorough review of LHC operation and system performance was the focus of discussions in the first phase of the annual LHC performance workshop, which took place from 23 to 26 January in Chamonix, France.

Experts from the accelerator sector, CERN management and members of the CERN Machine Advisory Committee explored the operational scenarios for the remainder of Run 2 and made preliminary decisions regarding optics and machine parameters. Beam is due back in the LHC at the beginning of May, and the rest of the year will essentially be dedicated to proton–proton physics, with the usual mix of machine development and special physics runs. By quantifying the limits on peak luminosity from electron-cloud effects, the cryogenics system and other factors, the workshop also drew up luminosity estimates for the coming years: in 2017 the peak luminosity should reach at least 1.7 × 10³⁴ cm⁻² s⁻¹, and the integrated-luminosity target for ATLAS and CMS is 45 fb⁻¹.
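
As a rough cross-check of these numbers, the sketch below converts a peak luminosity into an annual integrated luminosity. The number of physics days and the average-to-peak efficiency are assumptions made for illustration, not figures from the workshop.

```python
# Back-of-the-envelope conversion of peak luminosity into an annual
# integrated luminosity. The days of physics and the average-to-peak
# ratio are illustrative assumptions, not figures from the workshop.

PEAK_LUMI = 1.7e34    # cm^-2 s^-1, the 2017 target quoted above
PHYSICS_DAYS = 140    # assumed days devoted to proton-proton physics
AVG_TO_PEAK = 0.2     # assumed average/peak luminosity ratio
                      # (turnaround time, downtime, luminosity decay)

seconds = PHYSICS_DAYS * 24 * 3600
integrated = PEAK_LUMI * AVG_TO_PEAK * seconds  # in cm^-2

# 1 fb^-1 corresponds to 1e39 cm^-2
print(f"roughly {integrated / 1e39:.0f} fb^-1 per year")  # ~41 fb^-1
```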

One open question about future LHC operation concerns the increase of the beam energy from 6.5 to 7 TeV per beam, which would see the machine reach its design specification. To gain input on high-field magnet behaviour, a dipole-training campaign was conducted at the start of the year-end technical stop (CERN Courier March 2017 p9). Experience from this and previous training campaigns was reviewed, and the duration, timing and associated risks of pushing up to 7 TeV – including implications for other accelerator systems, such as the LHC beam dump – were explored. There will be no change of beam energy in 2017 and 2018; the goal is to prepare the LHC to run at 14 TeV during Run 3, with the experiments expressing a clear preference for making the change in energy in a single step.

Regarding the longer-term future of the LHC, the High-Luminosity LHC (HL-LHC) demands challenging proton and ion beam parameters from the injector complex. The LHC injector upgrade (LIU) project is charged with planning and executing wide-ranging upgrades to the complex to meet these requirements. Both the LIU and HL-LHC projects have come through a recent cost-and-schedule review, and at present are fully funded and on schedule. The injector upgrades will be deployed during Long Shutdown 2 (LS2) in 2019/2020, while the HL-LHC will see the major part of its upgrades implemented in LS3, which is due to start in 2024.

With only two more years of operation before the next long shutdown, planning for LS2 is already well advanced. For the LHC itself, LS2 will not require the same level of intervention as LS1. Nonetheless, there is still a substantial amount of work planned across the complex, including major upgrades to the injectors in the framework of LIU and significant upgrades to the LHC experiments.

The exploitation of the LHC and the injector complex has been impressive recently, but work across the Organization continues unabated in the push to get the best out of the LHC in both the medium and long term.

CMS undergoes tracker transplant

At the beginning of March, the CMS collaboration successfully replaced the heart of its detector: the pixel tracker. This innermost element of the CMS detector, a cylindrical device containing 124 million silicon pixels that record the trajectories of charged particles, is the first to be encountered by debris from the LHC’s collisions.

The original three-layer 64 Mpix tracker, which has been in place since the LHC started operations in 2008, was designed for a lower collision rate than the LHC will deliver in the coming years. Its replacement contains an additional layer and has its first layer placed closer to the interaction point. This will enable CMS to cope with the harsher collision environment of future LHC runs, in which the detector has to handle the products of a large number of simultaneous collisions. The new pixel detector will also be better at pinpointing where individual collisions occurred, and will therefore enhance the precision with which predictions of the Standard Model can be tested.


After a week of intense activity, and a few frayed nerves, the new subdetector was safely in place by 8 March. Once testing is complete, CMS will be closed, ready for the LHC’s return to action in May.

ATLAS reveals more strangeness in the proton

The excellent theoretical understanding of the production of electroweak W and Z gauge bosons in proton–proton collisions at the LHC makes these “standard-candle” processes ideal for studying the detailed performance of the ATLAS detector, and thus for improving the precision of measurements. Specifically, differences in the couplings of the W⁺, W⁻, Z and γ* bosons to quarks and antiquarks appear as differences in rapidity distributions, which reveal additional information about the structure of the proton.

Protons are often considered to be composed of two up quarks and one down quark, but when probed at small distances they reveal additional content. This includes a “sea” of up and down quarks, strange quarks from the heavier second generation of particles, and the gluons that bind the quarks together into the proton.

The ATLAS collaboration has now shed light on the least-known component of the proton – its content of strange quarks – based on sub-per-cent measurements of the kinematic dependencies of the W and Z boson cross-sections using LHC data recorded in 2011 at an energy of 7 TeV. Previous determinations of the strange-quark content of the proton were based on neutrino scattering, in which muons from the fragmentation of charm quarks produced in charged-current interactions were detected. Contrary to theoretical expectations, these data revealed a suppression of strange quarks relative to the up and down quarks.

Gaining further insight into the proton structure using inclusive W and Z boson production required significant experimental improvements, with painstaking calibration efforts bringing the detection efficiencies in real and simulated data into agreement at the per-mille level in both the electron and muon channels. Indeed, thanks to these studies, the ATLAS data provided a new test of electron–muon universality in the weak-interaction sector that is in excellent agreement with the Standard Model at the sub-per-cent level.

The combined electron and muon data, including the correlations of systematic uncertainties, were compared to predictions performed at next-to-next-to-leading order (NNLO) in QCD and next-to-leading order in electroweak theory. Using various parton distribution functions, the comparisons revealed significant tensions between measurement and theory. Interpreting the HERA inclusive deep-inelastic-scattering data together with the ATLAS data in an NNLO QCD fit revealed a new sensitivity to the strangeness suppression factor Rs = (s + s̄)/(d̄ + ū), as shown in the figure. The data confirm, with significantly improved precision, the previous ATLAS determination of an unsuppressed strange-quark content (shown as ATLAS-epWZ12) based on 2010 data.
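
For readers who wish to see how such a ratio is evaluated in practice, the sketch below queries a parton-distribution set through the LHAPDF Python bindings. The choice of set and of kinematic point is illustrative, not that of the ATLAS analysis.

```python
# Sketch: evaluate the strangeness ratio Rs = (s + sbar)/(dbar + ubar)
# from a parton-distribution set via the LHAPDF Python bindings.
# Assumes LHAPDF is installed with the named set; both the set and the
# kinematic point (x, Q^2) are illustrative, not those of the ATLAS fit.
import lhapdf

pdf = lhapdf.mkPDF("CT14nnlo", 0)

x, Q2 = 0.023, 1.9   # Bjorken x and scale Q^2 (GeV^2), illustrative

# xfxQ2(pid, x, Q2) returns x*f(x, Q2); PDG ids: 3 = s, -3 = sbar,
# -1 = dbar, -2 = ubar. The common factor of x cancels in the ratio.
s, sbar = pdf.xfxQ2(3, x, Q2), pdf.xfxQ2(-3, x, Q2)
dbar, ubar = pdf.xfxQ2(-1, x, Q2), pdf.xfxQ2(-2, x, Q2)

print(f"Rs(x={x}, Q2={Q2} GeV^2) = {(s + sbar) / (dbar + ubar):.2f}")
```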

The result may have important implications for further precision measurements of Standard Model parameters, in particular the mass of the W boson and the weak-mixing angle, since these are affected by the second generation of quarks. The ATLAS measurement challenges the current paradigm of a suppressed strange-quark distribution relative to the other light-quark distributions, but the quest continues.

ALICE measures shape of the QGP fireball at freeze-out

Heavy-ion collisions at LHC energies create a hot and dense medium of deconfined quarks and gluons, known as the quark–gluon plasma (QGP). The QGP fireball first expands, cools and then freezes out into a collection of final-state hadrons. Correlations between the free particles carry information about the space–time extent of the emitting source, and are imprinted on the final-state spectra due to a quantum-mechanical interference effect. To measure these correlations and to determine the space–time parameters of the source, physicists utilise Hanbury Brown and Twiss (HBT) interferometry, a technique first used in astronomy for determining the angular sizes of stars. Using azimuthally differential HBT interferometry, the ALICE collaboration has recently measured the shape of the fireball at freeze-out.

In a non-central collision, the nuclear overlap region is almond-shaped, with the longer axis oriented perpendicular to the reaction plane (defined by the impact parameter and the beam direction). The spatial anisotropies in the initial state are converted, via pressure gradients, to momentum anisotropies, leading to anisotropic particle flow. The magnitudes of the momentum anisotropies are quantified by the so-called vₙ coefficients, where the second-harmonic coefficient (v₂) is generated from the system’s approximately elliptic shape. This is usually called elliptic flow, and the direction of the strongest component of elliptic flow is defined as the elliptic-flow plane.

The HBT radius, measured as a function of the pair-emission azimuth relative to the elliptic-flow plane, exhibits oscillations that carry information on the eccentricity of the source at freeze-out, when the particles cease to interact; this eccentricity can be estimated from the oscillation amplitude at low pion-pair transverse momentum. ALICE has measured the pion HBT-radius oscillations for different transverse-momentum ranges as a function of centrality in lead–lead collisions at an energy of 2.76 TeV per nucleon pair, and plotted the results as a function of the initial eccentricity (see figure).
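
The essence of the extraction can be captured in a toy fit: parametrise the squared sideward HBT radius as Rs,0² + 2Rs,2²cos(2φ) and estimate the freeze-out eccentricity as εF ≈ 2Rs,2²/Rs,0². The sketch below uses invented data points for illustration.

```python
import numpy as np
from scipy.optimize import curve_fit

# Toy sketch: fit the second-harmonic oscillation of the squared
# sideward HBT radius versus the pair emission angle phi (measured
# from the elliptic-flow plane), then form the freeze-out eccentricity
# as eps_F ~ 2*R2_2/R2_0. The "measured" points are invented.

def r2_side(phi, r2_0, r2_2):
    return r2_0 + 2.0 * r2_2 * np.cos(2.0 * phi)

rng = np.random.default_rng(0)
phi = np.linspace(0.0, np.pi, 8, endpoint=False)
r2_meas = r2_side(phi, 16.0, 0.4) + rng.normal(0.0, 0.1, phi.size)  # fm^2

(r2_0, r2_2), _ = curve_fit(r2_side, phi, r2_meas, p0=(15.0, 0.0))
print(f"freeze-out eccentricity ~ {2.0 * r2_2 / r2_0:.3f}")  # ~0.05
```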

The final eccentricities are significantly below the initial eccentricities due to a larger expansion in the in-plane direction. The freeze-out eccentricities measured by ALICE are smaller than those measured at RHIC energies, likely reflecting the longer lifetime of the system at the LHC. Hydrodynamic calculations performed for similar centralities and pair transverse-momentum ranges as in the ALICE experiment show a similar trend, but predict smaller final-source eccentricity corresponding to a more spherical source.

The final-state source eccentricity remains positive for all the pair transverse-momentum ranges, indicating that even after a stronger expansion in the in-plane direction, the pion source at freeze-out is still elongated in the out-of-plane direction. In the future, the ALICE collaboration intends to measure the azimuthal dependence of the HBT radii relative to the higher-harmonic (n ≥ 3) flow planes, which is directly sensitive to anisotropies in the system’s collective velocity fields.

Rare decay puts Standard Model on the spot

The decay rate of the B⁰ₛ meson to two muons is a flagship measurement in flavour physics. It is extremely rare and well predicted in the Standard Model (SM), with a branching fraction of (3.65±0.23) × 10⁻⁹. It proceeds via a loop diagram that involves the heaviest known particles: the Z and W bosons and the top quark. Any unknown heavier particles that exist are likely to also contribute to this decay, which makes it a very sensitive probe of physics beyond the SM. After three decades of unsuccessful searches, the observation of the decay was first announced in a joint paper in Nature in 2015 by the CMS and LHCb collaborations using LHC data from Run 1.

Recently the LHCb collaboration reported an improved analysis of this decay with data from 2015 and 2016 added to the Run-1 sample. Work during the long shutdown allowed significant improvements to be made in background rejection, which increased the experiment’s sensitivity. The B⁰ₛ → μ⁺μ⁻ peak is clearly visible in the resulting mass plot, with a small bump to its left possibly due to the B⁰ meson (see figure, top). The significance of the former is 7.8σ, corresponding to the first observation of this decay by a single experiment. At just 1.6σ, the B⁰ peak is not significant.

Using the well-known decays B⁰ → K⁺π⁻ and B⁺ → J/ψK⁺ to calibrate and normalise the efficiencies, the B⁰ₛ → μ⁺μ⁻ branching fraction is measured to be (3.0±0.6) × 10⁻⁹, the most precise measurement to date. Although the result is consistent with the SM, the experimental precision still has to improve before it matches the present theoretical accuracy.
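
Schematically, the normalisation converts the ratio of signal to reference yields into a branching fraction. The sketch below illustrates the bookkeeping with placeholder numbers; none of them are LHCb’s actual yields, efficiencies or inputs.

```python
# Sketch of the normalisation bookkeeping: the signal yield is turned
# into a branching fraction using a well-measured reference channel,
#   B_sig = B_ref * (N_sig/N_ref) * (eps_ref/eps_sig) * (f_u/f_s),
# where f_u/f_s corrects for the b-quark hadronisation fractions.
# Every number below is a placeholder, not an actual LHCb value.

B_REF = 6.0e-5                   # visible branching fraction of the reference
N_SIG, N_REF = 60, 2_000_000     # fitted signal and reference yields
EPS_SIG, EPS_REF = 0.5, 0.4      # total selection efficiencies
FU_OVER_FS = 3.9                 # hadronisation-fraction ratio

b_sig = B_REF * (N_SIG / N_REF) * (EPS_REF / EPS_SIG) * FU_OVER_FS
print(f"B(Bs -> mu+ mu-) ~ {b_sig:.1e}")
```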

For the first time, LHCb also measured the effective lifetime of the B⁰ₛ → μ⁺μ⁻ decay. The B⁰ₛ meson system has much in common with that of the K⁰ meson, in that it comprises a heavier long-lived state and a lighter shorter-lived state. Only the former is allowed to decay into μ⁺μ⁻ in the SM, but that may not be the case in other scenarios. The effective lifetime, obtained by fitting a single exponential to the decay-time distribution (figure, below), discriminates between the contributions of the two states. The fitted value is consistent within 1σ with the hypothesis that only the heavier state contributes, and within 1.4σ of the opposite hypothesis. While this result does not yet tell us anything about new physics, it allows the sensitivity to be extrapolated to larger data samples. With the 300 fb⁻¹ integrated-luminosity target of the LHCb phase-II upgrade, the two states could be disentangled at the 5σ level, providing a new and important test of the SM.
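
The logic of the lifetime test can be demonstrated with a toy study: generate decay times under the heavy-state-only (SM) hypothesis, fit a single exponential and compare the fitted effective lifetime with the two eigenstate lifetimes. In the sketch below the lifetimes are approximately the known values, while the sample size is arbitrary.

```python
import numpy as np

# Toy sketch of the effective-lifetime test: generate decay times under
# the SM hypothesis (only the heavy eigenstate decays to mu+ mu-), fit
# a single exponential (whose maximum-likelihood estimate is just the
# sample mean) and compare with both eigenstate lifetimes. Lifetimes
# are approximately the known values; the sample size is arbitrary.

TAU_H, TAU_L = 1.62, 1.42   # ps, heavy and light Bs eigenstates

rng = np.random.default_rng(1)
t = rng.exponential(TAU_H, size=200)

tau_eff = t.mean()                 # MLE of an exponential lifetime
err = tau_eff / np.sqrt(t.size)    # approximate statistical error

print(f"tau_eff = {tau_eff:.2f} +/- {err:.2f} ps")
print(f"pull vs heavy-only: {(tau_eff - TAU_H) / err:+.1f} sigma")
print(f"pull vs light-only: {(tau_eff - TAU_L) / err:+.1f} sigma")
```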

BaBar casts further doubt on dark photons

Dark photons are hypothetical low-mass spin-1 particles that couple to dark matter but have only feeble couplings to normal matter. Such a boson, which may be associated with a U(1) gauge symmetry in the dark sector and mix kinetically with the Standard Model photon, offers an explanation for puzzling astrophysical observations such as the positron abundance in cosmic rays reported by the PAMELA satellite. Dark photons have also been invoked as a possible explanation of the muon g-2 anomaly.
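
The link to g-2 comes from a one-loop diagram in which the muon emits and reabsorbs an Aʹ. The sketch below evaluates the standard one-loop formula numerically; the values of the kinetic-mixing strength ε and the Aʹ mass are illustrative only.

```python
import math
from scipy.integrate import quad

# Numerical sketch of the standard one-loop dark-photon contribution to
# the muon anomalous magnetic moment:
#   delta_a_mu = (alpha * eps^2 / (2*pi)) *
#                int_0^1 dx 2x(1-x)^2 / [(1-x)^2 + (m_A'/m_mu)^2 * x]
# The values of eps and m_A' used below are illustrative only.

ALPHA = 1.0 / 137.035999
M_MU = 0.1056584   # muon mass in GeV

def delta_a_mu(eps, m_ap):
    r2 = (m_ap / M_MU) ** 2
    integral, _ = quad(lambda x: 2*x*(1 - x)**2 / ((1 - x)**2 + r2*x), 0, 1)
    return ALPHA * eps**2 * integral / (2 * math.pi)

print(f"delta a_mu ~ {delta_a_mu(1e-3, 0.050):.1e}")  # eps = 1e-3, 50 MeV
```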

Based on single-photon events in 53 fb⁻¹ of e⁺e⁻ collision data collected at SLAC’s PEP-II B factory in California, the BaBar collaboration has now completed a thorough search for these particles (Aʹ) via the process e⁺e⁻ → γAʹ. The search was based on the assumption that the dark photon decays almost entirely to dark-matter particles, so that no energy would be deposited in the BaBar detector by its decay products. Finding no evidence for such processes, the analysis places 90% confidence-level upper limits on the coupling strength of Aʹ to e⁺e⁻ for dark photons lighter than 8 GeV. In particular, the BaBar limits exclude the values of the Aʹ coupling suggested by the dark-photon interpretation of the muon g-2 anomaly, as well as a broad range of parameters for dark-sector models (see figure).

“This paper is the final word from BaBar on a search where the dark photon decays invisibly,” says BaBar spokesperson Michael Roney. “But we are continuing to search for dark photons and other dark-sector particles that have visible decay modes.”

The BaBar result follows another direct search for sub-GeV dark photons carried out recently by CERN’s NA64 experiment, in which electrons incident on an active target probe the process e⁻Z → e⁻Z Aʹ. Again, no evidence for such decays was found, and NA64 was able to exclude dark photons with a mass below around 0.1 GeV.

“The thing is, there are dark photons and dark photons,” says theorist Sean Carroll of Caltech, who has worked on dark-photon models. “In contrast to massless dark photons, which are analogous to ordinary photons, this experiment constrains a slightly different idea of dark force-carrying particles that are associated with a broken symmetry, which therefore get a mass and then can decay. They are more like ‘dark Z bosons’ than dark photons.”

Gravitational lens challenges cosmic expansion

Using galaxies as vast gravitational lenses, an international group of astronomers has made an independent measurement of how fast the universe is expanding. The newly measured expansion rate is consistent with earlier findings in the local universe based on more traditional methods, but intriguingly remains higher than the value derived by the Planck satellite – a tension that could hint at new physics.

The rate at which the universe is expanding, defined by the Hubble constant, is one of the fundamental quantities in cosmology and is usually determined by techniques that use Cepheid variables and supernovae as points of reference. A group of astronomers from the H0LiCOW collaboration, led by Sherry Suyu of the Max Planck Institute for Astrophysics in Germany, ASIAA in Taiwan and the Technical University of Munich, used gravitational lensing to provide an independent measurement of this constant. The gravitational lens consists of a galaxy that deforms space–time and hence bends the light travelling from a background quasar, an extremely luminous and variable galaxy core. This bending results in multiple images of the same quasar, as seen from Earth, almost perfectly aligned with the lensing galaxy (see image).

While simple in theory, the technique is complex in practice. A straightforward equation relates the Hubble constant to the lengths of the deflected light paths between the quasar and Earth. Since the brightness of a quasar changes over time, astronomers see the different images of the quasar flicker at different times, with delays that depend on the lengths of the paths the light has taken. Deriving the Hubble constant therefore depends on very precise modelling of the mass distribution in the lensing galaxy, as well as on several hundred accurate measurements of the multiple images of the quasar to determine its variability pattern over many years.
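
In outline, the measured delays constrain the so-called time-delay distance, DΔt = (1 + zl)DlDs/Dls, a combination of angular-diameter distances that scales as 1/H0. The sketch below makes this scaling explicit using astropy, with illustrative redshifts rather than those of an actual H0LiCOW lens.

```python
import astropy.units as u
from astropy.cosmology import FlatLambdaCDM

# Sketch: the measured delays constrain the time-delay distance
#   D_dt = (1 + z_l) * D_l * D_s / D_ls,
# a combination of angular-diameter distances that scales as 1/H0.
# The redshifts are illustrative, not those of a real H0LiCOW lens.

Z_L, Z_S = 0.5, 1.8   # lens and source redshifts (assumed)

def time_delay_distance(h0):
    cosmo = FlatLambdaCDM(H0=h0 * u.km / u.s / u.Mpc, Om0=0.3)
    d_l = cosmo.angular_diameter_distance(Z_L)
    d_s = cosmo.angular_diameter_distance(Z_S)
    d_ls = cosmo.angular_diameter_distance_z1z2(Z_L, Z_S)
    return (1 + Z_L) * d_l * d_s / d_ls

for h0 in (67.9, 71.9):
    print(f"H0 = {h0} km/s/Mpc -> D_dt = {time_delay_distance(h0):.0f}")
# A larger H0 shrinks D_dt, and with it the predicted time delays, so
# longer measured delays pull the inferred Hubble constant down.
```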

This complexity explains why the measurement of the Hubble constant – reported in a separate publication by H0LiCOW collaborator Vivien Bonvin from the EPFL in Switzerland and co-workers – relies on a total of four papers by the H0LiCOW collaboration. The value obtained, H0 = 71.9±2.7 km s⁻¹ Mpc⁻¹, is in excellent agreement with other recent determinations in the local universe using classical cosmic distance-ladder methods. One of these, by Adam Riess and collaborators, finds an even higher value of the Hubble constant (H0 = 73.2±1.7 km s⁻¹ Mpc⁻¹) and has therefore triggered a lot of interest in recent months.

The reason is that such values are in tension with the precise determination of the Hubble constant by the Planck satellite. Assuming standard “Lambda Cold Dark Matter” cosmology, the Planck collaboration derived from the cosmic-microwave-background radiation a value of H0 = 67.9±1.5 km s⁻¹ Mpc⁻¹ (CERN Courier May 2013 p12). The discrepancy between Planck’s probe of the early universe and local values of the Hubble constant could be an indication that we are missing a vital ingredient in our current understanding of the universe.

A possible explanation of this discrepancy, according to Riess and colleagues, could involve an additional source of dark radiation in the early universe, corresponding to a significant increase in the effective number of neutrino species. It will be interesting to follow this debate in the coming years, when new observing facilities and also new parallax measurements of Cepheid stars by the Gaia satellite will reduce the uncertainty of the Hubble constant determination to a per cent or less.
