

Stable beams at 13 TeV

At 10.40 a.m. on 3 June, the LHC operators declared “stable beams” for the first time at a beam energy of 6.5 TeV. It was the signal for the LHC experiments to start taking physics data for Run 2, this time at a collision energy of 13 TeV – nearly double the 7 TeV with which Run 1 began in March 2010. After a shutdown of almost two years and several months of re-commissioning, first without and then with beam, the world’s largest particle accelerator was back in business. Under the gaze of the world via a live webcast and blog, the LHC’s two counter-circulating beams, each with three bunches of nominal intensity (about 10¹¹ protons per bunch), were taken through the full cycle from injection to collisions. This was followed by the declaration of stable beams and the start of Run 2 data taking.

The occasion marked the nominal end of an intense eight weeks of beam commissioning (CERN Courier May 2015 p5 and June 2015 p5) and came just two weeks after the first test collisions at the new record-breaking energy. On 20 May at around 10.30 p.m., protons collided in the LHC at 13 TeV for the first time. These test collisions were to set up various systems, in particular the collimators, and were established with beams that were “de-squeezed” to make them larger at the interaction points than during standard operation. This set-up was in preparation for a special run for the LHCf experiment (“LHCf makes the most of a special run”), and for luminosity calibration measurements by the experiments where the beams are scanned across each other – the so-called “van der Meer scans”.

Progress was also made on the beam-intensity front, with up to 50 nominal bunches per beam brought into stable beams by mid-June. There were some concerns that an unidentified obstacle in the beam pipe of a dipole in sector 8-1 could be affected by the higher beam currents but, so far at least, this has not proved to be the case. No unusual beam losses were observed at the location of the obstacle, and the steps towards the first sustained physics run continued.

The final stages of preparation for collisions involved setting up the tertiary collimators (CERN Courier September 2013 p37). These are situated on the incoming beam about 120–140 m from the interaction points, where the beams are still in separate beam pipes. The local orbit changes in this region both during the “squeeze” to decrease the beam size at the interaction points and after the removal of the “separation bumps” (produced by corrector magnets to keep the beams separated at the interaction points during the ramp and squeeze). This means that the tertiary collimators must be set up with respect to the beam, both at the end of the squeeze and with colliding beams. In contrast, the orbit and optics at the main collimator groupings in the beam-cleaning sections at points 7 and 3 are kept constant during the squeeze and during collisions, so their set-up remains valid throughout all of the high-energy phases.

By the morning of 3 June, all was ready for the planned attempt at the first “stable beams” of Run 2, with three bunches of protons at nominal intensity per beam. At 8.25 a.m., the injection of beams of protons from the Super Proton Synchrotron to the LHC was complete, and the ramp to increase the energy of each beam to 6.5 TeV began. However, the beams were soon dumped during the ramp by the software interlock system. The interlock was related to a technical issue with the interlocked beam-position-monitor system, but this was rapidly resolved. About an hour later, at 9.46 a.m., three nominal bunches were once more circulating in each beam and the ramp to 6.5 TeV had begun again.

At 10.06 a.m., the beams had reached their top energy of 6.5 TeV and the “flat top” at the end of the ramp. The next step was the “squeeze”, using quadrupole magnets on both sides of each experiment to decrease the size of the beams at the interaction point. With this successfully completed by 10.29 a.m., it was time to adjust the beam orbits to ensure an optimal interaction at the collision points. Then at 10.34 a.m., monitors showed that the two beams were colliding at a total energy of 13 TeV inside the ATLAS and CMS detectors; collisions in LHCb and ALICE followed a few minutes later.

At 10.42 a.m., the moment everyone had been waiting for arrived – the declaration of stable beams – accompanied by applause and smiles all round in the CERN Control Centre. “Congratulations to everybody, here and outside,” CERN’s director-general, Rolf Heuer, said as he spoke with evident emotion following the announcement. “We should remember this was two years of teamwork. A fantastic achievement. I am touched. I hope you are also touched. Thanks to everybody. And now time for new physics. Great work!”

The eight weeks of beam commissioning had seen a sustained effort by many teams working nights, weekends and holidays to push the programme through. Their work involved optics measurements and corrections, injection and beam-dump set-up, collimation set-up, wrestling with various types of beam instrumentation, optimization of the magnetic model, magnet aperture measurements, etc. The operations team had also tackled the intricacies of manipulating the beams through the various steps, from injection through ramp and squeeze to collision. All of this was backed up by the full validation of the various components of the machine-protection system by the groups concerned. The execution of the programme was also made possible by good machine availability and the support of other teams working on the injector complex, cryogenics, survey, technical infrastructure, access, and radiation protection.

Over the two-year shutdown, the four large experiments ALICE, ATLAS, CMS and LHCb also went through an important programme of maintenance and improvements in preparation for the new energy frontier.

Among the consolidation and improvements to 19 subdetectors, the ALICE collaboration installed a new dijet calorimeter to extend the range covered by the electromagnetic calorimeter, allowing measurement of the energy of the photons and electrons over a larger angle (CERN Courier May 2015 p35). The transition-radiation detector that detects particle tracks and identifies electrons has also been completed with the addition of five more modules.

A major step during the long shutdown for the ATLAS collaboration was the insertion of a fourth and innermost layer in the pixel detector, to provide the experiment with better precision in vertex identification (CERN Courier June 2015 p21). The collaboration also used the shutdown to improve the general ATLAS infrastructure, including electrical power, cryogenic and cooling systems. The gas system of the transition-radiation tracker, which contributes to the identification of electrons as well as to track reconstruction, was modified significantly to minimize losses. In addition, new chambers were added to the muon spectrometer, the calorimeter read-out was consolidated, the forward detectors were upgraded to provide a better measurement of the LHC luminosity, and a new aluminium beam pipe was installed to reduce the background.

To deal with the increased collision rate that will occur in Run 2 – which presents a challenge for all of the experiments – ATLAS improved the whole read-out system to be able to run at 100 kHz and re-engineered all of the data-acquisition software and monitoring applications. The trigger system was redesigned, going from three levels to two, while implementing smarter and faster selection algorithms. It was also necessary to reduce the time needed to reconstruct ATLAS events, despite the additional activity in the detector. In addition, an ambitious upgrade of simulation, reconstruction and analysis software was completed, and a new generation of data-management tools on the Grid was implemented.

The biggest priority for CMS was to mitigate the effects of radiation on the performance of the tracker, by equipping it to operate at low temperatures (down to –20 °C). This required changes to the cooling plant and extensive work on the environment control of the detector and cooling distribution to prevent condensation or icing (CERN Courier May 2015 p28). The central beam pipe was replaced by a narrower one, in preparation for the installation in 2016–2017 of a new pixel tracker that will allow better measurements of the momenta and points of origin of charged particles. Also during the shutdown, CMS added a fourth measuring station to each muon endcap, to maintain discrimination between low-momentum muons and background as the LHC beam intensity increases. Complementary to this was the installation at each end of the detector of a 125 tonne composite shielding wall to reduce neutron backgrounds. A luminosity-measuring device, the pixel luminosity telescope, was installed on either side of the collision point around the beam pipe.

Other major activities for CMS included replacing photodetectors in the hadron calorimeter with better-performing designs, moving the muon read-out to more accessible locations for maintenance, installation of the first stage of a new hardware triggering system, and consolidation of the solenoid magnet’s cryogenic system and of the power distribution. The software and computing systems underwent a significant overhaul during the shutdown to reduce the time needed to produce analysis data sets.

To make the most of the 13 TeV collisions, the LHCb collaboration installed the new HeRSCheL detector – High Rapidity Shower Counters for LHCb. This consists of a system of scintillators installed along the beamline up to 114 m from the interaction point, to define forward rapidity gaps. In addition, one section of the beryllium beam pipe was replaced, and the new beam-pipe support structure is much lighter than before.

The CERN Data Centre has also been preparing for the torrent of data expected from collisions at 13 TeV. The Information Technology department purchased and installed almost 60,000 new cores and more than 100 PB of additional disk storage to cope with the increased amount of data that is expected from the experiments during Run 2. Significant upgrades have also been made to the networking infrastructure, including the installation of new uninterruptible power supplies.

First stable beams was an important step for LHC Run 2, but there is still a long way to go before this year’s target of around 2500 bunches per beam is reached and the LHC starts delivering some serious integrated luminosity to the experiments. The LHC and the experiments will now run around the clock for the next three years, opening up a new frontier in high-energy particle physics.

• Compiled from articles in CERN’s Bulletin and other material on CERN’s website. To keep up to date with progress with the LHC and the experiments, follow the news at bulletin.cern.ch or visit www.cern.ch.

LHC and Planck: where two ends meet

Over the past decade and more, cosmology on one side and particle physics on the other have approached what looks like a critical turning point. The theoretical models that for many years have been the backbone of research carried out in both fields – the Standard Model for particle physics and the Lambda cold dark matter (ΛCDM) model for cosmology – are proving insufficient to describe more recent observations, including those of dark matter and dark energy. Moreover, the most important “experiment” that ever happened, the Big Bang, remains unexplained. Physicists working at both extremes of the scale – the infinitesimally small and the infinitely large – face the same problem: they know that there is much to search for, but their arms seem too short to reach still further distances. So, while researchers in the two fields maintain their specific interests and continue to build on their respective areas of expertise, they are also looking increasingly at each other’s findings to reconstitute the common mosaic.

Studies on the nature of dark matter are the most natural common ground between cosmology and particle physics. Run 2 of the LHC, which has just begun, is expected to shed some light on this area. Indeed, while the main outcome of Run 1 was undoubtedly the widely anticipated discovery of a Higgs boson, Run 2 is opening the door to uncharted territory. In practical and experimental terms, exploring the properties and the behaviour of nature at high energy consists in understanding possible signals that include “missing energy”. In the Standard Model, this energy discrepancy is associated with neutrinos, but in physics beyond the Standard Model, the missing energy could also be the signature of many undiscovered particles, including the weakly interacting massive particles (WIMPs) that are among the leading candidates for dark matter. If WIMPs exist, the LHC’s collisions at 13 TeV may reveal them, and this will be another huge breakthrough. Because supersymmetry has not yet been ruled out, the high-energy collisions might also eventually unveil the supersymmetric partners of the known particles, at least the lighter ones. Missing energy could also account for the escape of a graviton into extra dimensions, or a variety of other possibilities. Thanks to the LHC’s Run 1 and other recent studies, the Standard Model is so well known that future observation of an unknown source of missing energy could be confidently linked to new physics.
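The "missing energy" idea can be made concrete with a small sketch. The following toy calculation (my own illustration with invented numbers, not any experiment's reconstruction code) shows the basic principle: because the colliding protons carry no net transverse momentum, the missing transverse momentum is the negative vector sum of the transverse momenta of all visible particles, and a large imbalance can signal an invisible particle such as a neutrino or a WIMP.

```python
import math

def missing_et(visible):
    """Missing transverse energy: magnitude of the negative vector sum
    of the visible particles' transverse momenta (px, py), in GeV."""
    px = -sum(p[0] for p in visible)
    py = -sum(p[1] for p in visible)
    return math.hypot(px, py)

# Toy event: two jets that do not balance each other in the transverse
# plane, leaving ~40 GeV unaccounted for - possibly carried away by
# something the detector cannot see.
jets = [(120.0, 5.0), (-80.0, -3.0)]
print(round(missing_et(jets), 1))  # 40.0
```

In a real analysis the sum runs over all reconstructed objects and calorimeter deposits, and detector resolution smears the result, but the vector-balance principle is the same.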

Besides the search for dark matter, another area where cosmology and particle physics meet is neutrino physics. The most recent result that collider experiments have published for the number of standard (light) neutrino types is Nν = 2.984±0.008 (ALEPH et al. 2006). While the search for a fourth, right-handed neutrino continues with ground-based experiments, satellite experiments have shown that they too can have their say. Indeed, recent results from ESA’s Planck mission yield Neff = 3.04±0.18 for the effective number of relativistic degrees of freedom, and the sum of neutrino masses is constrained to Σmν < 0.17 eV. These values, derived from Planck’s measurements of the CMB temperature and polarization anisotropies in combination with data from baryon-acoustic-oscillation surveys, are consistent with standard cosmological and particle-physics predictions in the neutrino sector (Planck Collaboration 2015a). Although these values do not completely rule out a sterile neutrino, especially one thermalized at a different background temperature, its existence is disfavoured by the Planck data (figure 1).


Working out absolute neutrino masses is no easy task. Ground-based experiments have observed the oscillation of neutrinos directly, which proves that these elusive particles have a nonzero mass. However, no measurement of the absolute masses has been performed yet, and the strongest upper limit on their sum (about one order of magnitude tighter than direct-detection measurements) comes from cosmology. Because neutrinos are the most abundant massive particles in the universe, their absolute mass influences the formation of large-scale structure as much as it affects the many physics processes observed at small scales. The picture in the present Standard Model might suggest (perhaps naively) that the mass distribution among the neutrinos could be similar to the mass distribution among the other particles and their families, but only experiments such as KATRIN – the Karlsruhe Tritium Neutrino experiment – are expected to shed some light on this topic.

In recent years, cosmologists and particle physicists have shown a common interest in testing Lorentz and CPT invariance. The topic seems to be particularly relevant for theorists working on string theories, which sometimes involve mechanisms that lead to a spontaneous breaking of these symmetries. To find possible clues, satellite experiments are probing the cosmic microwave background (CMB) to investigate the universe’s birefringence, which would be a clear signature of Lorentz-invariance violation and, therefore, of CPT violation. So far, the CMB experiments WMAP, QUaD and BICEP1 have found a value of α – the rotation angle of the photon-polarization plane – consistent with zero. Results from Planck on the full set of observations are expected later this year.

Since its discovery in 2012, the Higgs boson found at the LHC has been in the spotlight for physicists studying both extremes of the scale. Indeed, in addition to its confirmed role in the mass mechanism, recent papers have discussed its possible role in the inflation of the universe. Could a single particle be the Holy Grail for cosmologists and particle physicists alike? It is a fascinating question, and many studies have been published about the particle’s possible role in shaping the early history of the universe, but the theoretical situation is far from clear. On one side, the Higgs boson and the inflaton share some basic features, but on the other side, the Standard Model interactions do not seem sufficient to generate inflation unless there is an anomalously strong coupling between the Higgs boson and gravity. Such strong coupling is a highly debated point among theoreticians. Also in this case, the CMB data could help to rule out or disentangle models. Recent full mission data from Planck clearly disfavour natural inflation compared with models that predict a smaller tensor-to-scalar ratio, such as the Higgs inflationary model (Planck Collaboration 2015b). However, the question remains open, and subject to new information coming from the LHC’s future runs and from new cosmological missions.


In the meantime, astroparticle physics is positioning itself as the area where both cosmology and particle physics could find answers to the open questions. An event at CERN in April provided a showcase for experiments on cosmic rays and dark matter, in particular the latest results from the Alpha Magnetic Spectrometer (AMS) collaboration on the antiproton-to-proton ratio in cosmic rays and on the proton and helium fluxes. Following earlier measurements by PAMELA – the Payload for Antimatter Matter Exploration and Light nuclei Astrophysics – which took data in 2006–2011, AMS now has results based on more than 6 × 10¹⁰ cosmic-ray events (electrons, positrons, protons and antiprotons, as well as nuclei of helium, lithium, boron, carbon, oxygen…) collected during the first four years of AMS-02 on board the International Space Station. With events at energies up to many tera-electron-volts, and with unprecedented accuracy, the AMS data provide systematic information on the deepest nature of cosmic rays. The antiproton-to-proton ratio measured by AMS in the energy range 0–500 GeV shows a clear discrepancy with existing models (figure 2). Anomalies are also visible in the behaviour of the fluxes of electrons, positrons, protons, helium and other nuclei. However, although a large part of the scientific community tends to interpret these observations as a new signature of dark matter, the origin of such unexpected behaviour cannot be easily identified, and discussions are still ongoing within the community.

It may seem that the universe is playing hide-and-seek with cosmologists and particle physicists alike as they probe both ends of the distance scale. However, the two research communities have a new smart move up their sleeves to unveil its secrets – collaboration. Bringing together the two ends of the scales probed by the LHC and by Planck will soon bear fruit. Watch this space!

Frontier detectors for the future

The last week of May saw a gathering of 390 physicists from 27 countries and four continents on the island of Elba. The 13th edition of the Pisa Meeting on Advanced Detectors for Frontier Physics took place in the secluded Biodola area. The conference, which takes place every three years, is based on a consolidated format, aiming at an interdisciplinary exchange of ideas: all sessions are plenary, with a round table on a topic of interest (CERN Courier July/August 2006 p31). The programme for this year was built on a record number of contributions (more than 400), out of which 327 were selected for either oral (66) or poster presentations. Eight companies were present throughout the meeting, with stands to display their products and to discuss ongoing and future R&D projects.

The opening session saw an introductory talk by Toni Pich of Valencia that described the situation in frontier physics today. The discovery of a particle associated with the Brout–Englert–Higgs mechanism has opened a whole new field of investigation to explore, in addition to the “known unknowns”. Among these, revealing the nature of dark matter and of neutrino masses is the main priority. In the following talk, CERN’s Michelangelo Mangano discussed the search for supersymmetry, as well as different possibilities for signals of new physics that will be explored with high priority from the start of Run 2 at the LHC.

A key event was the round table organized on the second day of the meeting, with 13 people representing nine laboratories (CERN, the Institute of High Energy Physics (IHEP) in Beijing, Fermilab, PSI, TRIUMF, the European Spallation Source, KEK and the Japan Proton Accelerator Research Complex) and four funding agencies (the US Department of Energy, the Institut national de physique nucléaire et de physique des particules (IN2P3), the Istituto Nazionale di Fisica Nucleare (INFN) and the UK’s Science and Technology Facilities Council). The topic for discussion was “Synergies and complementarity among laboratories”, in view of the challenges of the coming decades and of the growing role of CERN as the place where the energy frontier will be explored. The presentation about the future of high-energy physics in China by Gang Chen of IHEP was particularly enlightening, giving a perspective and an impressive plan spanning the middle of this century. Representatives of the funding agencies discussed the nearer future, where – besides the High Luminosity LHC project – a strong neutrino programme is foreseen. The lively exchange among the scientists at the table and participants on the floor left everyone with a vivid perception that what Sergio Bertolucci, CERN’s director for research and computing, defined as “co-opetition” among different institutions in high-energy physics, must move forward and become part of the texture of daily work. Several participants stressed that although CERN is central, regional laboratories have an important role because they relate directly to the host nations. Demonstrating the societal impact of research in high-energy physics to politicians and to the public at large is a key point in obtaining support for the whole field.

Each Pisa meeting has a number of standard sessions on gas and solid-state detectors, particle and photon identification, calorimetry and advanced electronics, astroparticle physics, and the application of high-energy-physics techniques in other fields. The presentations, both oral and with posters, demonstrated that significant improvements in existing detectors and current techniques are still possible. The topics presented covered dedicated R&D as well as novel ideas, some developed in a beneficial crossover with other areas, ranging from material science to nanotechnology and chemistry. In a dedicated session, speakers from the LHC experiments noted that the detectors are now performing well and are ready to help harvest the physics at 13 TeV that will come from the LHC’s Run 2.

As the field keeps changing, so does the conference. This year, a new session was introduced to offer adequate space to applied superconductivity. The technique is now fundamental, not just to provide stronger magnetic fields for accelerators and spectrometers, but also in specialized detectors. The review talk by Akira Yamamoto of KEK and CERN outlined the new frontier of superconducting magnets, both in terms of achievable field and of stored energy/mass ratio. Emanuela Barzi and Alexander Zlobin presented the R&D programme for high-field superconducting magnets at Fermilab. The laboratory that pioneered the use of superconducting magnets in accelerators now aims to be able to build magnets suitable for the Future Circular Collider design study (CERN Courier April 2014 p16). The use of superconducting materials to detect photons was discussed in two talks, by Martino Calvo of CNRS Grenoble and Roberto Leoni of IFN-CNR, Rome. The use of cryogenic detectors – bolometers, kinetic-inductance detectors, transition-edge sensors, to name but a few – was discussed by Flavio Gatti of INFN Genova, in a review of the large number of posters on the subject presented at the conference.

The meeting saw the awarding of the first Aldo Menzione Prize. Among his many activities, Aldo was one of the founders of the Pisa meeting and a recipient of the W K H Panofsky Prize in 2009. He passed away in December 2012 (CERN Courier April 2013 p37), and to honour his memory, the Frontier Detectors for Frontier Physics (FDFP) association that organizes the conference series established an award to be assigned at each meeting to “a distinguished scientist who has contributed to the development of detector techniques”. The recipients of the prize on this first occasion were David Nygren, now of the University of Texas at Arlington, for the invention of the time-projection chamber, and Fabio Sauli, now of the TERA Foundation, for the invention of the gas electron-multiplier, or GEM. The prizes were presented by Donata Foà, Aldo’s widow, and Angelo Scribano, the president of the FDFP.

At the end of the conference dinner, several awards were also assigned by an international jury. Elsevier established two Elsevier Young Scientist Awards to honour the late Glenn Knoll, who was an editor of Nuclear Instruments and Methods (NIM). These were presented by Fabio Sauli, on behalf of NIM, to Filippo Resnati of CERN and Joana Wirth of the Technische Universität München, respectively, for his talk on the “Charge transfer properties through graphene for applications in gaseous detectors”, and for her poster on “CERBEROS: a tracking system for secondary pion beams at the HADES spectrometer”. Three FDFP awards to “talented young scientists active in the development of detection techniques and contributing, by talk or poster, to the scientific programme” were conferred by Angelo Scribano to Lars Graber of the University of Göttingen for his talk on “A 3D diamond detector for particle tracking”, Roberto Acciarri of Fermilab for a poster on “Experimental study of breakdown electric fields in liquid argon” and Raffaella Donghia of INFN-LNF for her poster on “Time performances and irradiation tests of CsI crystals read-out by MPPC”.

Concluding the conference, the chair, Marco Grassi of INFN-Pisa, provided a few statistics. He remarked that 36% of the participants were below 35 years old and nearly all of them – 96% – contributed to the conference programme with oral presentations or posters. This demonstrates that the field of detector development is attractive and has a strong basis on which it can grow, as long as, at a national level, institutes can continue to recruit these young scientists. This, as Catherine Clerc from IN2P3 reminded everybody during the round table, is the most pressing challenge in many European countries.

• For further information, visit the conference website https://agenda.infn.it/conferenceDisplay.py?confId=8397, where all of the presentations (oral and posters) are available.

The LHC prepares for high-energy collisions

A proton–proton collision at 900 GeV

Following the restart of CERN’s flagship accelerator in early April, commissioning the LHC with beam is progressing well. In the early hours of 10 April, the operations team successfully circulated a beam at 6.5 TeV for the first time – a new world record – but this was only one of many steps to be taken before the accelerator delivers collisions at this beam energy.

The operators reached another important milestone on 21 April, when they succeeded in circulating a nominal-intensity bunch. The first commissioning steps in particular take place with low-intensity (probe) beams – single bunches of 5 × 10⁹ protons. The nominal intensity, in contrast, is a little over 1 × 10¹¹ protons per bunch, and when the LHC is in full operation later this year, some 2800 bunches will circulate in each beam.
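A back-of-the-envelope calculation (my own arithmetic, using only the figures quoted above and the standard electron-volt-to-joule conversion) shows why these intensities demand careful machine protection: the energy stored in a full beam reaches hundreds of megajoules.

```python
# Rough estimate of protons per beam and stored beam energy at 6.5 TeV,
# using the figures from the text; 1 eV = 1.602e-19 J.
bunches = 2800
protons_per_bunch = 1.1e11        # "a little over 1e11"
beam_energy_tev = 6.5

protons_per_beam = bunches * protons_per_bunch
energy_per_proton_j = beam_energy_tev * 1e12 * 1.602e-19
stored_energy_mj = protons_per_beam * energy_per_proton_j / 1e6

print(f"{protons_per_beam:.2e} protons per beam")
print(f"~{stored_energy_mj:.0f} MJ stored per beam")  # hundreds of MJ
```

This is why the beam-dump, interlock and collimation systems described below must be validated before the intensity is increased.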

To handle the higher number of protons per bunch and the higher number of bunches safely, a number of key systems have to be fully operational and set up with beam. These include the beam-dump system, the beam-interlock system and the collimation system. The latter involves around 100 individual pairs of jaws, each of which has to be positioned with respect to the beam during all of the phases of the machine cycle. Confirmation that everything is as it should be is obtained by deliberately provoking beam losses and checking that the collimators catch the losses as they are supposed to.

On 2 May, this set-up procedure allowed a nominal-intensity bunch in each beam to be taken to 6.5 TeV. Four days later, collisions were produced at the injection energy of 450 GeV, enabling the experiment teams to record events and check alignment and synchronization of the detectors. One of the important steps in reaching this stage is to commission the “squeeze” – the final phase in the LHC cycle of injection, ramp and squeeze. During this phase, the strengths of the magnetic fields either side of a given experiment are adjusted to reduce the beam size at the corresponding interaction point.

• To find out more, see the LHC reports in CERN Bulletin: bulletin.cern.ch.

Turkey becomes associate member state of CERN

The Republic of Turkey became an associate member state of CERN on 6 May, following notification that Turkey has ratified an agreement signed last year, granting this status to the country. Turkey’s new status will strengthen the long-term partnership between CERN and the Turkish scientific community. Associate membership will allow Turkey to attend meetings of the CERN Council. Moreover, it will allow Turkish scientists to become members of the CERN staff, and to participate in CERN’s training and career-development programmes. Finally, it will allow Turkish industry to bid for CERN contracts, thus opening up opportunities for industrial collaboration in areas of advanced technology.

CMS identifies Higgs bosons decaying to bottom quarks


The mass of the Higgs boson discovered at CERN is close to 125 GeV. If it really is the Standard Model Higgs boson (H), it should decay predominantly into a bottom quark–antiquark pair (bb), with a probability of about 58%. Therefore, the observation and study of the H → bb decay, which involves the direct coupling of H to fermions and in particular to down-type quarks like d-, s- and b-quarks, is essential in determining the nature of the discovered boson. The inclusive observation of the decay H → bb is currently not achievable at the LHC: in proton–proton collisions, bb pairs are produced abundantly via the strong force as described by QCD, providing a completely irreducible background.


An intriguing and challenging way to search for H → bb is through the mechanism of vector-boson fusion (VBF). In this case, the signal features a four-jet final state: two b-quark (bb) jets originating from the Higgs-boson decay, and two light-quark (qq) jets, predominantly in the forward and backward directions with respect to the beamline – a distinctive signature of VBF in proton collisions. A further distinctive feature of VBF is that no QCD colour is exchanged in the process. This leads to the expectation of a “rapidity gap” – that is, reduced hadronic activity between the two tagging qq jets, apart from the Higgs-boson decay products.

CMS has searched for these VBF-produced Higgs bosons decaying to b quarks in the 8-TeV proton–proton collision data recorded in 2012. This is the only fully hadronic final state employed to search for a Standard Model Higgs boson at the LHC. A dedicated data-triggering strategy was crucial, both within the standard “prompt” data streams and, in parallel, within “parked” data streams that were reconstructed later, during the LHC shutdown. Candidate events are required to have four jets with transverse momenta above optimized thresholds. Pseudorapidity (angular) separation and b-quark-tagging criteria are then used to assign two jets to the bb system and the other two jets to the qq VBF-tagging system.
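As a toy illustration of such a selection – with invented thresholds and cuts, not the actual CMS values – the assignment of the four leading jets to bb and qq systems might be sketched as:

```python
from dataclasses import dataclass

@dataclass
class Jet:
    pt: float      # transverse momentum, GeV
    eta: float     # pseudorapidity
    btag: float    # b-tag discriminator in [0, 1]

PT_THRESHOLDS = [80.0, 70.0, 50.0, 40.0]  # illustrative values only

def select_vbf_hbb(jets):
    """Return (bb_pair, qq_pair) if the event passes a toy VBF H->bb
    selection, else None."""
    jets = sorted(jets, key=lambda j: j.pt, reverse=True)
    if len(jets) < 4:
        return None
    lead = jets[:4]
    # all four leading jets must pass their (toy) pT thresholds
    if any(j.pt < thr for j, thr in zip(lead, PT_THRESHOLDS)):
        return None
    # the two jets with the highest b-tag score form the bb candidate
    by_btag = sorted(lead, key=lambda j: j.btag, reverse=True)
    bb, qq = by_btag[:2], by_btag[2:]
    # VBF tagging jets must be well separated in pseudorapidity
    if abs(qq[0].eta - qq[1].eta) < 2.5:   # illustrative gap cut
        return None
    return bb, qq

jets = [Jet(120, 2.9, 0.10), Jet(100, -2.8, 0.05),
        Jet(90, 0.3, 0.95), Jet(60, -0.2, 0.90)]
print(select_vbf_hbb(jets) is not None)  # this toy event passes
```

In the real analysis the thresholds are optimized per trigger stream and the b-tagging uses dedicated discriminators; the sketch only mirrors the logical flow of the cuts described above.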


Selected events are passed to a multivariate boosted decision tree (BDT) trained to separate signal events from the large background of multi-jet events produced by QCD. The events are categorized according to the output values of the BDT, making no use of the kinematic information of the two b-jet candidates. Subsequently, the invariant-mass distribution of the two b-jets is analysed in each category, to search for a signal “bump” on top of the smooth background shape. The figure shows the result of the fit in the best signal category. It reveals an observed (expected) significance of the signal of 2.2 (0.8)σ, for a Higgs-boson mass of 125 GeV. A parallel measurement of Z → bb decays in the selected data samples, using the same signal-extraction technique, has been performed to validate the analysis strategy.

The results of this search have been combined with the results of other CMS searches for the decay of the Higgs boson to bottom quarks, produced in association with a vector boson or with a top-quark pair. For mH = 125 GeV, the combination yields a fitted H → bb signal strength μ = 1.03±0.44, relative to the expectation of the Standard Model, with a significance of 2.6σ. This is a convincing hint from the LHC for the coupling of the discovered boson to bottom quarks.

First full jet measurement in Pb–Pb collisions with ALICE


In high-energy collisions at the LHC, quarks and gluons occasionally scatter violently and produce correlated showers of particles, or “jets”. In proton–proton collisions, the rate of such scatters is precisely calculable using perturbative QCD. However, in heavy-ion collisions, jets should be modified, because the scattered quarks and gluons are expected to interact with the surrounding hot nuclear matter, the quark–gluon plasma (QGP). Jet measurements, together with model calculations of the “jet quenching” phenomenon, therefore provide important information about the properties of the QGP.

Fully reconstructed jets are measured in ALICE by high-precision tracking of charged particles in the central barrel, and by measuring the energy deposits of neutral particles in the electromagnetic calorimeter. This method of reconstructing jets differs from the more traditional approach with hadronic and electromagnetic calorimetry. It was first applied in ALICE to determine the production rate for jets in the case of proton–proton collisions (CERN Courier May 2013 p31). In heavy-ion collisions, measurements of jets are more challenging, because a single event contains multiple jets from independent nucleon–nucleon scatters, as well as combinatorial jets from the large and partially correlated underlying background of particles with low transverse momentum (pT).

ALICE has recently published results from the 2011 lead–lead (Pb–Pb) run, down to low jet pT, where jet quenching is expected to be most dramatic. Jets were reconstructed using the anti-kT algorithm with a resolution parameter of R = 0.2. Even for this rather small cone size, the average contribution of the background was measured to be 25±5 GeV/c in the 0–10% most central (highest multiplicity) Pb–Pb events. To deal with the background, the analysis first subtracted the average contribution in a given event jet-by-jet, and then corrected the resulting reconstructed jet spectrum for the background fluctuations and instrumental resolution via an unfolding procedure. This led to an overall systematic uncertainty of about 15–20%.

The nuclear modification of the jet yield (RAA) is quantified by the ratio of the jet spectrum measured in Pb–Pb collisions to that in proton–proton collisions, scaled by the number of independent nucleon–nucleon collisions. The figure shows RAA for the 0–10% and 10–30% most central Pb–Pb collisions, together with two model calculations. It reveals that jets in Pb–Pb are strongly suppressed, almost independently of jet pT, with an average nuclear modification factor of 0.28±0.04 in the 0–10% and 0.35±0.04 in the 10–30% centrality class. Both model calculations reproduce the level of jet suppression, although one of them predicts a slightly steeper rise with pT than seen in the data. This new measurement, which uses jet constituents down to a few hundred MeV/c, even in Pb–Pb collisions, opens new perspectives for studying the QGP with ALICE.
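The nuclear modification factor itself is a simple bin-by-bin ratio. A minimal sketch, using invented yields and an assumed ⟨Ncoll⟩ rather than ALICE data:

```python
# Toy jet spectra in three pT bins; the yields and <Ncoll> below are
# invented for illustration, not ALICE measurements.
pt_bins    = [(30, 40), (40, 50), (50, 60)]   # GeV/c
yield_pbpb = [4.2e-4, 1.3e-4, 4.4e-5]         # per-event jet yield, 0-10% Pb-Pb
yield_pp   = [1.0e-6, 3.0e-7, 1.0e-7]         # per-event jet yield, pp
n_coll     = 1500                             # assumed <Ncoll> for 0-10% centrality

def nuclear_modification(y_aa, y_pp, ncoll):
    """R_AA = (Pb-Pb yield) / (Ncoll x pp yield), bin by bin."""
    return [ya / (ncoll * yp) for ya, yp in zip(y_aa, y_pp)]

raa = nuclear_modification(yield_pbpb, yield_pp, n_coll)
for (lo, hi), r in zip(pt_bins, raa):
    print(f"{lo}-{hi} GeV/c: R_AA = {r:.2f}")
```

An RAA well below unity, roughly flat in pT as in this toy input, is the signature of jet quenching described in the text; the hard part of the real measurement is the background subtraction and unfolding, not this final ratio.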

ATLAS’s paths to the top-quark mass

The top quark is the heaviest elementary particle currently known, and its mass (mtop) is a fundamental parameter of the Standard Model. Its precise determination is essential for testing the consistency of the Standard Model and for constraining models of new physics. Now, ATLAS has released new measurements of mtop using events with one or two isolated charged leptons and jets in the final state – the lepton+jets and dilepton channels. The new results are based on proton–proton collision data taken at a centre-of-mass energy of 7 TeV.

The measurements were obtained from the direct reconstruction of the top-quark final states, and use calibrations based on Monte Carlo simulation. In the lepton+jets channel, for the first time, mtop is determined simultaneously with a global jet-energy scale factor, exploiting information from the hadronically decaying W boson, and with a separate b-to-light-quark jet-energy scale factor – a technique that significantly reduces the corresponding systematic uncertainties on mtop. The measurement in the dilepton channel is based on the invariant mass of the two charged-lepton plus b-quark-jet systems from top-quark-pair decays. The measurements in the two channels are largely uncorrelated, so their combination yields a substantial improvement in precision. The result, mtop = 172.99±0.91 GeV, corresponds to a relative uncertainty of 0.5% (ATLAS 2015a).
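The gain from combining two largely uncorrelated channels can be seen from a standard inverse-variance weighted average, sketched here with illustrative numbers rather than the actual ATLAS channel results:

```python
def combine(measurements):
    """Inverse-variance weighted average of uncorrelated measurements,
    given as (value, uncertainty) pairs."""
    weights = [1.0 / sigma**2 for _, sigma in measurements]
    mean = sum(w * x for w, (x, _) in zip(weights, measurements)) / sum(weights)
    sigma = sum(weights) ** -0.5     # combined uncertainty
    return mean, sigma

# Illustrative channel inputs (NOT the actual ATLAS numbers):
m, s = combine([(172.5, 1.2), (173.5, 1.5)])
print(f"combined mtop = {m:.2f} +/- {s:.2f} GeV")
```

The combined uncertainty is always smaller than the best individual one when the inputs are uncorrelated; the real ATLAS combination additionally accounts for partially correlated systematic uncertainties between channels.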

These new measurements, together with the results from the fully hadronic decay channel (ATLAS 2015b), complete the suite of mtop results based on 7-TeV data that exploit top-quark-pair signatures. They are complemented by a result based on single-top-quark-enriched topologies, using 8-TeV data (ATLAS 2014a).

In the direct mass-reconstruction techniques described above, the extracted value of mtop corresponds to the parameter implemented in the Monte Carlo simulation (mMCtop), whose relationship to the top-mass parameter in the Standard Model Lagrangian is not completely clear. The uncertainty relating the top mass in the Standard Model to mMCtop is a matter of debate, but is often estimated to be about 1 GeV, which is comparable to the present experimental precision.

ATLAS follows complementary paths to measure mtop by comparing measurements of inclusive and differential top-quark-pair production cross-sections with the corresponding theoretical calculations, which depend on the top-quark pole mass, mpoletop. To date, the most precise mpoletop determination is obtained from the differential cross-section measurement of top-quark-pair events with one additional jet. Using 7-TeV data, the measurement yields mpoletop = 173.7 +2.3 –2.1 GeV (ATLAS 2014b), which is compatible with the results from the direct reconstruction of the top-quark decays. The figure shows the ATLAS results for mtop, together with results from the Tevatron and the world average.

Upcoming results exploiting the full 8-TeV data set, and data from LHC Run 2, will further improve understanding of the mass of the top quark and its theoretical interpretation.

COMPASS observes a new narrow meson

Figure: mass spectrum for the f0(980).

The bulk of visible matter originates from the strong interactions between almost massless fundamental building blocks: quarks and antiquarks bound together by gluons. Although these interactions are described by QCD, understanding the underlying principle – how exactly these building blocks form observable matter (hadrons), and which configurations are or are not realized in nature – has been a major challenge for a long time. The question of how hadrons are formed relates directly to the excitation spectrum of hadrons, in particular mesons, which are made from quark–antiquark pairs. Theoretical predictions of the nature of hadronic bound states, their masses and decays have long been based on models, but direct QCD calculations, performed on high-performance computers using a discretized space–time lattice, are now also reaching a predictive level for new hadron states.


For many years, experiments have searched for hadronic bound states with exotic contents, such as gluon-only states (glueballs) or multi-quark states with a molecular nature. Some candidates had been found in studying systems with light quarks (glueballs, hybrids) or, most recently, with heavy quarks, revealing the first evidence for explicit multi-quark systems, based on the characteristic combination of charge and flavour.

Figure: mass-dependent phase variation.

The COMPASS collaboration has recently observed an unusual meson made from light quarks at a mass of 1.42 GeV/c2. Because this mass region has been investigated for half a century, the new particle comes as a surprise; its discovery was made possible by the world’s largest data sample for such studies. The particle is called the a1(1420), reflecting its properties of unit spin and isospin and positive parity, characteristic of the “a” mesons. The finding was made using the COMPASS spectrometer to study peripheral (diffractive) reactions of pions with a momentum of 190 GeV/c on a liquid-hydrogen target at CERN’s Super Proton Synchrotron. Despite a production rate of only about one part in 1,000 with respect to known mesons, the existence of the a1(1420) was clearly unravelled using an advanced analysis technique that allows the produced superposition of quantum states to be disentangled into the individual contributing components, both in terms of quantum numbers and decay paths. The unique signature for this observation is a strong narrow enhancement in the mass spectrum of the JPC = 1++ quantum state (figure opposite), in conjunction with an observed phase motion of about 180° – which any wave undergoes when its frequency (mass) passes through a resonance.
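The quoted phase motion is the behaviour expected of any resonant amplitude: a Breit-Wigner phase rises through 90° at the pole and approaches a total change of 180°. A minimal sketch, with mass and width values chosen only to be illustrative of the a1(1420) region, not the fitted COMPASS parameters:

```python
import cmath
import math

# Illustrative resonance parameters in GeV/c^2 (not the COMPASS fit values)
M0, GAMMA = 1.42, 0.15

def bw_phase(m):
    """Phase (degrees) of a simple relativistic Breit-Wigner amplitude
    1 / (M0^2 - m^2 - i*M0*Gamma) at invariant mass m."""
    amp = 1.0 / (M0**2 - m**2 - 1j * M0 * GAMMA)
    return math.degrees(cmath.phase(amp))

# The phase climbs through 90 deg exactly at the resonance mass
for m in (1.2, 1.42, 1.6):
    print(f"m = {m:.2f} GeV: phase = {bw_phase(m):6.1f} deg")
```

In the COMPASS analysis the corresponding observable is the relative phase between partial waves as a function of mass; the rapid ~180° motion across the peak is what distinguishes a genuine resonance from a smooth kinematic enhancement.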

The a1(1420) is observed decaying only into the f0(980) – itself often discussed as a molecular-type state – and an additional pion, rendering it unique. Following the first announcements of the finding, several explanations have already been put forward. They cover the interpretation of the a1(1420) as a molecular/tetraquark state partnering the known state f1(1420), as well as scenarios in which the a1(1420) is generated by long-range effects of various sorts, all involving the light meson a1(1260). However, despite some remarkable features, not all of the experimental findings can be reproduced by these explanations. Thus, the a1(1420) joins the club of resonances that are experimentally well established, yet unexplained.

Laser set-up generates electron–positron plasma in the lab

More than 99% of the visible universe exists as plasma, the so-called fourth state of matter. Produced from the ionization of gases dominated by hydrogen and helium, these electron–ion plasmas are ubiquitous in the local universe. An exotic fifth state of matter, the electron–positron plasma, exists in the intense environments surrounding compact astrophysical objects, such as pulsars and black holes, and until recently such plasmas were exclusively the realm of high-energy astrophysics. However, an international team led by Gianluca Sarri of Queen’s University Belfast, together with collaborators in the UK, US, Germany, Portugal and Italy, has at last succeeded in producing a neutral electron–positron plasma in a terrestrial laboratory experiment.

Electron–positron plasmas display peculiar features when compared with the other states of matter, on account of the symmetry between the negatively charged and positively charged particles, which in this case have equal mass but opposite charge. These plasmas play a fundamental role in the evolution of extreme astrophysical objects, including black holes and pulsars, and are associated with the emission of ultra-bright gamma-ray bursts. Moreover, it is likely that the early universe in the leptonic era – that is, in the minutes following approximately one second after the Big Bang – consisted almost exclusively of a dense electron–positron plasma in a hot photon bath.

While the production of positrons has long been achievable, the formation of a plasma of charge-neutral electron–positron pairs had remained elusive, owing to the practical difficulties in combining equal numbers of these extremely mobile charges. The recent success was made possible by looking at the problem from a different perspective: instead of generating two separate electron and positron populations and recombining them, the team aimed to generate an electron–positron plasma directly, in situ.


In an experiment at the Central Laser Facility at the Rutherford Appleton Laboratory in the UK, Sarri and colleagues made use of a laser-induced plasma wakefield to accelerate an ultra-relativistic electron beam. They focused an ultra-intense and short laser pulse (around 40 fs) onto a mixture of nitrogen and helium gas to produce, in only a few millimetres, electrons with an average energy of the order of 500–600 MeV. This beam was then directed onto a thick slab of a material of high atomic number – lead, in this case – to initiate an electromagnetic cascade, in a mainly two-step process. First, high-energy bremsstrahlung photons are generated as electrons or newly generated positrons propagate through the electric fields of the nuclei. Then, electron–positron pairs are generated during the interactions of the high-energy photons with the same fields. Under optimum experimental conditions, the team obtained, at the exit of the lead slab, a beam of electrons and positrons in equal numbers and of sufficient density to allow plasma-like behaviour.

These results represent a real novelty for experimental physics, and pave the way for a new experimental field of research: the study of symmetric matter–antimatter plasmas in the laboratory. Not only will it allow a better understanding of plasma physics from a fundamental point of view, but it should also shed light on some of the most fascinating, yet mysterious, objects in the known universe.

• The Central Laser Facility is supported by the UK’s Science and Technology Facilities Council. This experiment is supported by the UK’s Engineering and Physical Sciences Research Council.
