ATLAS illuminates the Higgs boson at 13 TeV

The ATLAS collaboration has released a comprehensive set of results that illuminate the properties of the Higgs boson with improved precision, using its decay into two photons in LHC collisions recorded at a centre-of-mass energy of 13 TeV.

The Higgs-to-two-photons decay played a crucial role in the discovery of the Higgs boson in 2012 owing to the excellent mass resolution and well-modelled backgrounds in this channel. Following the discovery, the properties of the Higgs boson can be probed more precisely using the large 13 TeV dataset.

One major result of the new study is the measurement of the signal strength μ, defined as the ratio of the number of observed to expected Higgs boson events. The signal strength is measured to be μ = 0.99 +0.15/−0.14, in good agreement with the Standard Model expectation. The precision is improved by a factor of two with respect to the previous measurements at energies of 7 and 8 TeV. The precision of the signal-strength measurements of individual Higgs boson production modes is also improved significantly, thanks to a better understanding of the ATLAS detector, the increased rate of Higgs production at 13 TeV and the extended use of machine-learning techniques to identify specific production processes.
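
Written out explicitly (our shorthand, not ATLAS notation), the signal strength is simply the measured rate relative to the Standard Model one:

\[ \mu \;=\; \frac{(\sigma\,\mathcal{B})_{\mathrm{obs}}}{(\sigma\,\mathcal{B})_{\mathrm{SM}}} \;\simeq\; \frac{N_{\mathrm{obs}}}{N_{\mathrm{exp}}}, \]

so μ = 1 corresponds exactly to the Standard Model expectation.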

Another key result of the present study is the measurement of nine simplified template cross sections (STXS), which are the cross sections of specific Higgs production channels measured in different kinematic regions. The STXS measurements are corrected for the impact of the Higgs-boson decay and incorporate the acceptance of the experiment, so that they can be combined across Higgs boson channels and experiments (see figure, left).

The properties of the Higgs boson are further investigated by measuring 20 differential and two double-differential cross sections. The Higgs boson transverse momentum (figure, right) and rapidity, the number and properties of jets produced in association with the Higgs boson, and several angular relations that allow us to probe its spin and CP quantum numbers are measured. Five of these distributions are used to search for new CP-even and CP-odd couplings between the Higgs boson and vector bosons or gluons. No significant deviations from the Standard Model predictions are observed.

Collectively, this new set of results at the highest LHC energies sheds light on the fundamental properties of the Higgs boson and extends our knowledge obtained from the first running period of the LHC.

Spotting the first extragalactic planets

Three decades after astronomers first detected planets outside our solar system, exoplanets are now being discovered at a rate of hundreds per year. Although it is reasonable to assume that galaxies other than our own contain planets, no direct detections of such objects have been made, owing to their small size and their large distances from Earth.

Now, however, radiation emitted around a distant black hole has revealed the existence of extragalactic planets in a galaxy 3.8 billion light-years away, located between the black hole and us. The planets, which cannot be detected directly with any existing telescope, are revealed by the small gravitational distortions they imprint on X-rays emanating from the more distant black hole.

The discovery was made by Xinyu Dai and Eduardo Guerras from the University of Oklahoma in the US using data from the Chandra X-ray Observatory. The distant black hole in question, which forms the supermassive centre of the quasar RX J1131-1231, is surrounded by an accretion disk that heats up as it orbits and emits radiation at X-ray wavelengths. Thanks to a fortunate cosmic alignment, this radiation is amplified by gravitational lensing and can therefore be studied accurately. The lensing galaxy positioned between Earth and the quasar causes light from RX J1131-1231 to bend around it, so that the quasar appears to us not as a normal point source but as a ring with four bright spots (see figure). The spots are produced by radiation from the same location in the quasar that initially followed different paths but ended up being directed towards Earth.

Dai and Guerras focused on a strong iron emission line, a spectral feature that reveals details of the accretion disk, and found that this line is not just shifted in energy but that the amount of the shift varies with time. Although a shift in the frequency of such a line is common – for example due to relative velocities between source and observer – its position is generally very stable with time when studying a specific object. Based on the 38 times RX J1131-1231 had been observed by the Chandra satellite during the past decade, the Oklahoma duo found that the line energy varied significantly between observations in all four bright points of the ring.

This feature can be explained using microlensing. The intermediate lensing galaxy is not a uniform mass but rather consists of small point masses, mainly stars and planets. As the relatively small objects within the lensing galaxy move, the light from the quasar passing through it is deflected in slightly different ways, causing different parts of the accretion disk to be amplified at different levels over time. As the different parts of the disk appear to emit at different energies, the measured variations in the energy of this emission line can be explained by the movement of objects within the lensing galaxy. The question is: what objects could cause such changes over time scales of several years?
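
A rough sense of why the lens mass sets the timescale comes from the standard point-lens relations (textbook formulae, not taken from the paper itself): the angular Einstein radius and the corresponding crossing time are

\[ \theta_{\mathrm{E}} = \sqrt{\frac{4GM}{c^{2}}\,\frac{D_{\mathrm{LS}}}{D_{\mathrm{L}}D_{\mathrm{S}}}}, \qquad t_{\mathrm{E}} \approx \frac{\theta_{\mathrm{E}}\,D_{\mathrm{L}}}{v_{\perp}}, \]

where D_L, D_S and D_LS are the distances to the lens, to the source and between the two, and v_⊥ is the transverse velocity of the lens. Since θ_E scales as the square root of the lens mass M, planet-mass objects naturally produce variations on much shorter timescales than stars.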

Stars, being numerous and massive, are one good candidate explanation. But Dai and Guerras calculated that the chance of a star causing such short-term variations is very small. A better candidate, suggested by fits to analytical models, is unbound planets, which do not orbit a star. The Chandra data were best described by a model in which, for each star, there are more than 2000 unbound planets with masses between that of the Moon and that of Jupiter. Although the exact population of such planets is not well known even for our own galaxy, their number is well within existing constraints. These observations thus form the best evidence yet for the existence of extragalactic planets and, by also providing an estimate of the number of such planets in that galaxy, teach us something about the number of unbound planets we can expect in our own galaxy.

We need to talk about the Higgs

It is just over five years ago that the discovery of the Higgs boson was announced, to great fanfare in the world’s media, as a crowning success of CERN’s Large Hadron Collider (LHC). The excitement of those days now seems a distant memory, replaced by a growing sense of disappointment at the lack of any major discovery thereafter.

While there are valid reasons to feel less than delighted by the null results of searches for physics beyond the Standard Model (SM), this does not justify a mood of despondency. A particular concern is that, in today’s hyper-connected world, apparently harmless academic discussions risk evolving into a negative outlook for the field in broader society. For example, a recent news article in Nature led on the LHC’s “failure to detect new particles beyond the Higgs”, while The Economist reported that “Fundamental physics is frustrating physicists”. Equally worryingly, the situation in particle physics is sometimes negatively contrasted with that for gravitational waves: while the latter is, quite rightly, heralded as the start of a new era of exploration, the discovery of the Higgs is often described as the end of a long effort to complete the SM.

Let’s look at things more positively. The Higgs boson is a totally new type of fundamental particle that allows unprecedented tests of electroweak symmetry breaking. It thus provides us with a novel microscope with which to probe the universe at the smallest scales, in analogy with the prospects for new gravitational-wave telescopes that will study the largest scales. There is a clear need to measure its couplings to other particles – especially its coupling with itself – and to explore potential connections between the Higgs and hidden or dark sectors. These arguments alone provide ample motivation for the next generation of colliders including and beyond the high-luminosity LHC upgrade.

So far the Higgs boson indeed looks SM-like, but some perspective is necessary. It took more than 40 years from the discovery of the neutrino to the realisation that it is not massless and therefore not SM-like; addressing this mystery is now a key component of the global particle-physics programme. Turning to my own main research area, the beauty quark – which reached its 40th birthday last year – is another example of a long-established particle that is now providing exciting hints of new phenomena (see Beauty quarks test lepton universality). One thrilling scenario, if these deviations from the SM are confirmed, is that the new physics landscape can be explored through both the b and Higgs microscopes. Let’s call it “multi-messenger particle physics”.

How the results of our research are communicated to the public has never been more important. We must be honest about the lack of new physics that we all hoped would be found in early LHC data, yet to characterise this as a “failure” is absurd. If anything, the LHC has been more successful than expected, leaving its experiments struggling to keep up with the astonishing rates of delivered data. Particle physics is, after all, about exploring the unknown; the analysis of LHC data has led to thousands of publications and a wealth of new knowledge, and there is every possibility that there are big discoveries waiting to be made with further data and more innovative analyses. We also should not overlook the returns to society that the LHC has brought, from technology developments with associated spin-offs to the training of thousands of highly skilled young researchers.

The level of expectation that has been heaped on the LHC seems unprecedented in the history of physics. Has any other facility been considered to have produced disappointing results because only one Nobel-prize winning discovery was made in its first few years of operation? Perhaps this reflects that the LHC is simply the right machine at the right time, but that time is not over: our new microscope is set to run for the next two decades and bring physics at the TeV scale into clear focus. The more we talk about that, the better our long-term chances of success.

The new ExTrA facility

A new national facility at La Silla Observatory in Chile, operated by the European Southern Observatory (ESO), made its first observations at the beginning of the year. ExTrA (Exoplanets in Transits and their Atmospheres) will search for Earth-sized planets orbiting nearby red dwarf stars, its three 0.6 m-diameter near-infrared telescopes (pictured) increasing the sensitivity compared to previous searches. ExTrA is a French project, also funded by the European Research Council, and the telescopes will be operated remotely from Grenoble.

Ancient black hole lights up early universe

Many questions remain about what happened in the first billion years of the universe. At around 100 million years old, the universe was a dark place, consisting mostly of neutral hydrogen, with few objects emitting detectable radiation. This situation changed as stars and galaxies formed, leading to a phase transition known as reionisation, during which the neutral hydrogen was ionised. Exactly when reionisation started and how long it took is still not fully clear, but the recent discovery of the earliest massive black hole yet found can help answer this important question.

Up to about 300,000 years after the Big Bang, the universe was hot and dense, and electrons and protons were fully separated. As the universe expanded, it cooled and underwent a first phase transition, in which electrons and protons combined into neutral gases such as hydrogen. The following period is known as the cosmic dark ages. During this period, protons and electrons were mostly bound in neutral hydrogen, but the universe had to cool much further before matter could condense to the point where light-producing objects such as stars could form. These new objects started to emit both the radiation we can now detect to study the early universe and the radiation responsible for the second phase transition – the reionisation of the universe. Some of the brightest, and therefore easiest-to-detect, objects are quasars: massive black holes surrounded by discs of hot accreting matter that emit radiation over a wide but distinctive spectrum.

Using data from a range of large-area surveys made by different telescopes, a group led by Eduardo Bañados from the Carnegie Institution for Science has discovered a distant quasar called J1342+0928, with the black hole at its centre found to have a mass of 800 million solar masses. After the radiation was emitted by J1342+0928, it travelled through the expanding universe, its wavelength increasing – “redshifting” – along the way. Using known spectral features of quasars, the redshift (and therefore the moment at which the radiation was emitted) can be calculated.
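
The step from a measured redshift to a cosmic age requires only an assumed cosmology. As a minimal sketch (our own illustration, using the public astropy package with its Planck 2015 parameters and the published redshift of 7.54 for J1342+0928; the paper’s adopted cosmology may differ slightly):

```python
# Illustrative only: convert the redshift of J1342+0928 into a cosmic age.
# Assumes astropy's Planck 2015 cosmology; the paper's parameters may differ.
from astropy.cosmology import Planck15

z = 7.54                                     # published redshift of J1342+0928
age_then = Planck15.age(z).to('Gyr').value   # cosmic age at emission (Gyr)
age_now = Planck15.age(0).to('Gyr').value    # present age of the universe (Gyr)

print(f"Age at z = {z}: {age_then:.2f} Gyr "
      f"({100 * age_then / age_now:.0f}% of the present age)")
# Gives roughly 0.69 Gyr, i.e. about 5% of the current age of the universe.
```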

The spectrum of J1342+0928, shown in the figure, demonstrates that the universe was only 690 million years old – just 5% of its current age – at the time we see J1342+0928. The spectrum also shows a second interesting feature: the absorption of a part of the spectrum by neutral hydrogen, which implies that at the time we are observing the black hole, the universe was not fully ionised yet. By modelling the emission and absorption, Bañados and co-workers found that the spectrum from J1342+0928 is compatible with emission in a universe where half the hydrogen was ionised, putting the time of emission right in the middle of the epoch of reionisation.

The next mystery is to explain how a black hole with a mass of 800 million solar masses could form so early in the universe. Black holes grow by accreting surrounding mass, but the accreting matter radiates, and this radiation pushes other infalling material away from the black hole. As a result, there is a theoretical limit on the rate at which a black hole can accrete. Forming a black hole as massive as J1342+0928 within such accretion limits would require seed black holes in the very early universe with masses that challenge current theoretical models. One possible explanation, however, is that this particular black hole is a peculiar case and was formed by the merger of several smaller black holes.

Thanks to continuous data-taking with a range of existing telescopes and upcoming new instrumentation, we can expect more objects like J1342+0928 – or even more distant ones – to be discovered, offering a probe of the universe at even earlier stages. The discovery of further objects would allow more precise dating of the epoch of reionisation, which can be compared with indirect measurements from the cosmic microwave background. At the same time, more measurements will show whether black holes of this size in the early universe are an anomaly or whether there are more. In either case, such observations will provide important input for research on early black-hole formation.

First cosmic-ray results from CALET on the ISS

The CALorimetric Electron Telescope (CALET), a space mission led by the Japan Aerospace Exploration Agency with participation from the Italian Space Agency (ASI) and NASA, has released its first results concerning the nature of high-energy cosmic rays.

Having docked with the International Space Station (ISS) on 25 August 2015, CALET is carrying out a full science programme with long-duration observations of high-energy charged particles and photons coming from space. It is the second high-energy experiment operating on the ISS, following the deployment of AMS-02 in 2011. During the summer of 2017 a third experiment, ISS-CREAM, joined these two. Unlike AMS-02, CALET and ISS-CREAM have no magnetic spectrometer and therefore measure the inclusive electron and positron spectrum. CALET’s homogeneous calorimeter is optimised to measure electrons, and one of its main science goals is to measure the detailed shape of the electron spectrum.

Due to the large radiative losses during their travel in space, high-energy cosmic electrons are expected to originate from regions relatively close to Earth (of the order of a few thousand light-years). Yet their origin is still unknown. The shape of the spectrum and the anisotropy of the arrival directions might contain crucial information as to where and how electrons are accelerated. They could also provide clues to possible signatures of dark matter – for example, a peak in the spectrum might point to dark-matter decay or annihilation with electrons or positrons in the final state – and shed light on the intriguing electron and positron spectra reported by AMS-02 (CERN Courier December 2016 p26).

To pinpoint possible spectral features on top of the overall power-law energy dependence of the spectrum, CALET was designed to measure the energy of the incident particle with very high resolution and with a large proton-rejection power, well into the TeV region. This is provided by a thick homogeneous calorimeter, preceded by a high-granularity imaging pre-shower, with a total depth of 30 radiation lengths at normal incidence. Calibration of the two instruments is key to controlling the energy scale, which is why CALET – a CERN-recognised experiment – performed several calibration tests at CERN.

The first data from CALET concern a measurement of the inclusive electron and positron spectrum in the energy range from 10 GeV to 3 TeV, based on about 0.7 million candidates (1.3 million in full acceptance). Above an energy of 30 GeV the spectrum can be fitted with a single power law with a spectral index of –3.152±0.016. A possible structure observed above 100 GeV requires further investigation with increased statistics and refined data analysis. Beyond 1 TeV, where a roll-off of the spectrum is expected and low statistics is an issue, electron data are now being carefully analysed to extend the measurement. CALET has been designed to measure electrons up to around 20 TeV and hadrons up to an energy of 1 PeV.
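
The quoted spectral index is essentially the slope of the flux in log–log space. A toy illustration of such a fit (our own sketch with invented data points; CALET’s real analysis uses its measured spectrum with full statistical and systematic uncertainties):

```python
# Toy power-law fit to a cosmic-ray electron spectrum.
# The data points are invented; this only illustrates how a spectral index
# such as -3.152 can be extracted as a straight-line slope in log-log space.
import numpy as np

energies = np.array([30.0, 100.0, 300.0, 1000.0, 3000.0])   # GeV
flux = 2.0e-4 * energies**-3.15                              # arbitrary units
flux *= 1.0 + 0.02 * np.random.default_rng(0).standard_normal(flux.size)

# F(E) = F0 * E**gamma  =>  log F = gamma * log E + log F0
gamma, log_f0 = np.polyfit(np.log(energies), np.log(flux), 1)
print(f"fitted spectral index: {gamma:.2f}")   # close to the injected -3.15
```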

CALET is a powerful space observatory with the ability to identify cosmic nuclei from hydrogen to elements heavier than iron. It also has a dedicated gamma-ray-burst monitor (CGBM) that has so far detected bursts at an average rate of one every 10 days in the energy range 7 keV–20 MeV. The search for electromagnetic counterparts of gravitational waves (GWs) detected by the LIGO and Virgo observatories proceeds around the clock thanks to a special collaboration agreement with LIGO and Virgo. Upper limits on X-ray and gamma-ray counterparts of the GW151226 event have been published, and further work on GW follow-ups is being carried out. Space-weather studies related to relativistic electron precipitation (REP) from the Van Allen belts have also been released.

With more than 500 million triggers collected so far and an expected extension of the observation time on the ISS to five years, CALET is likely to produce a wealth of interesting results in the near future.

ATLAS reports direct evidence for Higgs–top coupling

The Higgs boson interacts more strongly with more massive particles, so the coupling between the top quark and the Higgs boson (the top-quark Yukawa coupling) is expected to be large. The coupling can be directly probed by measuring the rate of events in which a Higgs boson is produced in association with a pair of top quarks (ttH production). Using the 13 TeV LHC data set collected in 2015 and 2016, several ATLAS analyses targeting different Higgs boson decay modes were performed. The combination of their results, released in late October, provides the strongest single-experiment evidence to date for ttH production.

The H → bb decay channel offers the largest rate of ttH events, but extracting the signal is difficult because of the large background from top quarks produced in association with a pair of bottom quarks. The analysis relies on the identification of b-jets and on multivariate techniques to reconstruct the events and determine whether candidates are more likely to arise from ttH production or from background processes.

The probability for the Higgs boson to decay to a pair of W bosons or a pair of τ leptons is smaller, but the backgrounds to ttH searches with these decays are also smaller and easier to estimate. These decays are targeted in searches for events with a pair of leptons carrying the same charge or three or more charged leptons (including electrons, muons, or hadronically decaying τ leptons). In total, seven different final states were probed in the latest ATLAS analysis.

Higgs boson decays to a pair of photons or to a pair of Z bosons with subsequent decays to lepton pairs (giving a four-lepton final state) are also considered. These decay channels have very small rates, but provide a high signal-to-background ratio.

In the combination of these ttH analyses, an excess with a significance of 4.2 standard deviations with respect to the no-ttH-signal hypothesis is observed, compared with 3.8 standard deviations expected for a Standard Model signal. This constitutes the first direct evidence from ATLAS for the ttH process. A cross-section of 590 +160/−150 fb is measured, in good agreement with the Standard Model prediction of 507 +35/−50 fb. This measurement, when combined with other Higgs boson production and decay studies, will shed more light on the possible presence of physics beyond the Standard Model in the Higgs sector.

CMS sees Higgs boson decaying to b-quarks

The CMS experiment has added another piece to the Higgs boson puzzle, reporting evidence that the Higgs decays to a pair of b quarks.

In the Standard Model (SM) the Higgs field couples to fermions through Yukawa interactions, giving them their masses. The recent CMS observation of the H → ττ channel provides direct evidence of this interaction. While it is clear that the Higgs boson couples to up-type quarks (based on the overall agreement between the gluon–gluon fusion production cross-section and the SM prediction), the Higgs boson decay to bottom quark–antiquark pairs provides a unique tool to directly access its coupling to down-type quarks.

The Higgs boson decays to a pair of b quarks 58% of the time, making this by far its most frequent decay channel. At the LHC, however, the signal is overwhelmed by QCD production of b quarks, which is several orders of magnitude larger, making the H → bb process very elusive. The most effective way to observe it is to search for associated production with an electroweak vector boson (VH, with V being a W or a Z boson). Further background reduction is achieved by requiring the Higgs boson candidates to have large transverse momentum and by exploiting the distinctive kinematic properties of VH events.

The latest CMS analysis is based on LHC data collected last year at an energy of 13 TeV. To identify jets originating from b quarks, the collaboration used a novel combined multivariate b-tagging algorithm that exploits the presence of soft leptons together with information such as track impact parameters and secondary vertices. A signal region enriched in VH events was then selected, together with several control regions to test the accuracy of the Monte Carlo simulations, and a simultaneous binned-likelihood fit of the signal and control regions used to extract the Higgs boson signal.
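
The fit strategy can be illustrated with a deliberately over-simplified toy: given binned background and signal templates, the signal strength is the value that maximises a product of Poisson probabilities across the bins. The numbers and the single signal region below are invented and bear no relation to the real CMS templates:

```python
# Toy binned maximum-likelihood fit for a signal strength mu.
# All yields are invented; the real analysis fits signal and control regions
# simultaneously and includes systematic uncertainties as nuisance parameters.
import numpy as np
from scipy.optimize import minimize_scalar
from scipy.stats import poisson

background = np.array([120.0, 90.0, 60.0, 30.0, 15.0])   # expected background per bin
signal = np.array([2.0, 6.0, 10.0, 8.0, 4.0])            # expected SM signal per bin
observed = np.array([125, 99, 74, 40, 19])               # pseudo-data

def nll(mu):
    """Negative log-likelihood for a given signal strength."""
    expected = background + mu * signal
    return -np.sum(poisson.logpmf(observed, expected))

fit = minimize_scalar(nll, bounds=(0.0, 5.0), method="bounded")
print(f"fitted signal strength mu = {fit.x:.2f}")
```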

An excess of events is observed compared with the expectation in the absence of a H → bb signal. The significance of the excess is 3.3σ, where the expectation for SM Higgs boson production is 2.8σ. The signal strength corresponding to this excess, relative to the SM expectation, is 1.2±0.4. When combined with the Run 1 measurement at lower energy, the signal significance is 3.8σ, with 3.8σ expected, and the signal strength is 1.1.

To validate the analysis procedure, the same methodology was used to extract a signal for the VZ process, with Z → bb, which has a nearly identical final state but a different invariant mass and a larger production cross-section. The observed excess of events for the combined WZ and ZZ processes has a significance of 5σ with respect to the background-only expectation, and the corresponding signal strength is 1.0±0.2.

Thanks to the outstanding performance of the LHC, the data set will increase significantly by the end of Run 2 in 2018. This will allow a substantial reduction of the uncertainties, and a 5σ observation of the H → bb decay is expected.

Extreme cosmic rays reveal clues to origin

The energy spectrum of cosmic rays continuously bombarding the Earth spans many orders of magnitude, with the highest-energy events topping 10⁸ TeV. Where these extreme particles come from, however, has remained a mystery since their discovery more than 50 years ago. Now the Pierre Auger collaboration has published results showing that the arrival directions of ultra-high-energy cosmic rays (UHECRs) are far from uniform, giving a clue to their origins.

The discovery in 1963 at the Volcano Ranch experiment of cosmic rays with energies exceeding one million times the energy of the protons in the LHC raised many questions. Not only is the charge of these hadronic particles unknown, but the acceleration mechanisms required to produce UHECRs, and the environments that can host such mechanisms, are still being debated. Proposed origins include sources in the galactic centre, extreme supernova events, mergers of neutron stars, and extragalactic sources such as blazars. Unlike photons or neutrinos, charged cosmic rays do not point directly back to their origin because, despite their extreme energies, their paths are deflected by magnetic fields both inside and outside our galaxy. Since the deflection decreases as the energy increases, however, UHECRs with the highest energies might still retain information about their arrival direction.
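
The energy dependence of the deflection follows from the Larmor radius of a charged particle: for charge Ze and energy E in a magnetic field B, the bending radius grows with energy, so the deflection accumulated over a path length L falls as 1/E (a scaling argument only; no realistic field values are implied here):

\[ r_{\mathrm{L}} \simeq \frac{E}{ZeBc}, \qquad \theta \sim \frac{L}{r_{\mathrm{L}}} \propto \frac{Z\,B\,L}{E}. \]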

At the Pierre Auger Observatory, cosmic rays are detected using a vast array of detectors spread over an area of 3000 km² near the town of Malargüe in western Argentina. Like the first cosmic-ray detectors in the 1960s, the array measures the air showers induced as the cosmic rays interact with the atmosphere. The arrival times of the particles, measured with GPS receivers, are used to determine the direction from which the primary particles came to within approximately one degree.

The collaboration studied the arrival directions of particles with energies in the range 4–8 EeV and of particles with energies exceeding 8 EeV. In the former data set no clear anisotropy was observed, whereas for particles with energies above 8 EeV a dipole structure was observed (see figure), indicating that more particles come from a particular part of the sky. Since the maximum of the dipole lies outside the galactic plane, the measured anisotropy is consistent with an extragalactic origin. The collaboration reports that the maximum, when the deflection by magnetic fields is taken into account, is consistent with a region of the sky known to have a large density of galaxies, supporting the view that UHECRs are produced in other galaxies. The lack of anisotropy at lower energies could be a result of the stronger deflection of these particles in the galactic magnetic field.
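
One standard way to quantify such a dipole is a first-harmonic (Rayleigh) analysis of the arrival directions in right ascension. The sketch below is our own toy with simulated directions; the actual Auger analysis additionally corrects for the non-uniform exposure of the observatory:

```python
# Toy first-harmonic (Rayleigh) analysis of arrival directions in right ascension.
# Simulated events only; the real analysis also accounts for detector exposure.
import numpy as np

rng = np.random.default_rng(1)
n_thrown = 100_000
alpha = rng.uniform(0.0, 2.0 * np.pi, n_thrown)       # right ascensions (rad)
# Accept-reject to inject a 6% dipole pointing at alpha = 100 degrees.
keep = rng.uniform(0.0, 1.0, n_thrown) < 0.5 * (1.0 + 0.06 * np.cos(alpha - np.radians(100)))
alpha = alpha[keep]

a = 2.0 * np.mean(np.cos(alpha))                       # first-harmonic coefficients
b = 2.0 * np.mean(np.sin(alpha))
amplitude = np.hypot(a, b)                             # recovered dipole amplitude
phase = np.degrees(np.arctan2(b, a)) % 360.0           # recovered dipole phase

print(f"amplitude = {amplitude:.3f}, phase = {phase:.0f} deg")
# Recovers roughly 0.06 and 100 degrees, the injected values.
```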

The presented dipole measurement is based on a total of 30,000 cosmic rays measured by the Pierre Auger Observatory, which is currently being upgraded. Although the results indicate an extragalactic origin, the particular source responsible for accelerating these particles remains unknown. The upgraded observatory will enable more data to be acquired and allow a more detailed investigation of the currently studied energy ranges. It will also open the possibility to explore even higher energies where the magnetic-field deflections become even smaller, making it possible to study the origin of UHECRs, their acceleration mechanism and the magnetic fields that deflect them.

Gravitational waves and the birth of a new science

On 14 September 2015, the world changed for those of us who had spent years preparing for the day when we would detect gravitational waves. Our overarching goal was to directly detect gravitational radiation, finally confirming a prediction made by Albert Einstein in 1916. A year after he had published his theory of general relativity, Einstein predicted the existence of gravitational waves in analogy to electromagnetic waves (i.e. photons) that propagate through space from accelerating electric charges. Gravitational waves are produced by astrophysical accelerations of massive objects, but travel through space as oscillations of space–time itself.

It took 40 years before the theoretical community agreed that gravitational waves are real and an integral part of general relativity. At that point, proving they exist became an experimental problem, and experiments using large bars of aluminium were instrumented to detect a tiny change in shape from the passage of a gravitational wave. Following a vigorous worldwide R&D programme, a potentially more sensitive technique – suspended-mass interferometry – has superseded resonant-bar detectors. There was limited theoretical guidance regarding what sensitivity would be required to achieve detections from known astrophysical sources, but various estimates indicated that a strain sensitivity ΔL/L of approximately 10⁻²¹ would be needed to detect known sources such as binary compact objects (binary black-hole mergers, binary neutron-star systems or black-hole–neutron-star systems). That is roughly equivalent to measuring the Earth–Sun separation to a precision of about the size of an atom.
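
To get a feel for that number, multiply the strain by a macroscopic baseline (simple order-of-magnitude arithmetic on our part):

\[ \Delta L = \left(\frac{\Delta L}{L}\right) L \approx 10^{-21} \times 1.5\times10^{11}\ \mathrm{m} \approx 1.5\times10^{-10}\ \mathrm{m}, \]

i.e. about an atomic diameter over the Earth–Sun distance; over LIGO’s 4 km arms the same strain corresponds to roughly 4 × 10⁻¹⁸ m, a few thousandths of a proton radius.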

The US National Science Foundation approved the construction of the Laser Interferometer Gravitational-Wave Observatory (LIGO) in 1994 at two locations: Hanford in Washington state and Livingston in Louisiana, 3000 km away. At that time, there was a network of cryogenic resonant-bar detectors spread around the world, including one at CERN, but suspended-mass interferometers have the advantage of broadband frequency acceptance (basically the audio band, 10–10,000 Hz) and a factor-1000 longer arms, making it feasible to measure a smaller ΔL/L. Earth-based detectors are sensitive to the most violent events in the universe, such as the merger of compact objects, supernovae and gamma-ray bursts. The detailed interferometric concept and innovations had already been demonstrated during the 1980s and 1990s in a 30 m prototype in Garching, Germany, and a 40 m prototype at Caltech in the US. Nevertheless, these prototype interferometers were at least four orders of magnitude away from the target sensitivity.

Strategic planning

We built a flexible technical infrastructure for LIGO such that it could accommodate a future major upgrade (Advanced LIGO) without rebuilding too much infrastructure. Initial LIGO had mostly used demonstrated technologies to assure technical success, despite the large extrapolation from the prototype interferometers. After completing Initial LIGO construction in about 2000, we undertook an ambitious R&D programme for Advanced LIGO. Over a period of about 10 years, we performed six observational runs with Initial LIGO, each time searching for gravitational waves with improved sensitivity. Between each run, we made improvements, ran again, and eventually reached our Initial LIGO design sensitivity. But, unfortunately, we failed to detect gravitational waves.

We then undertook a major upgrade to Advanced LIGO, with the goal of improving the sensitivity over Initial LIGO by at least a factor of 10 across the entire frequency range. To accomplish this, we developed a more powerful Nd:YAG laser system to reduce shot noise at high frequencies, a multiple-suspension system and larger test masses to reduce thermal noise at intermediate frequencies, and active seismic isolation, which reduced seismic noise at frequencies around 40 Hz by a factor of 100 (CERN Courier January/February 2017 p34). This was key to our discovery, two years ago, of our first merger of two roughly 30-solar-mass black holes, whose signal is concentrated at low frequencies. The increased sensitivity to such events had expanded the volume of the universe searched by a factor of up to 10⁶, enabling a binary black-hole-merger detection coincidence within 6 ms between the Livingston and Hanford sites.

We recorded the last 0.2 seconds of this astrophysical collision – the final inspiral, merger and “ring-down” phases – constituting the first direct observation of gravitational waves. The waveform was accurately matched by numerical-relativity calculations, with a signal-to-noise ratio of 24:1 and a statistical significance easily exceeding 5σ. Beyond confirming Einstein’s prediction, this event represented the first direct observation of black holes, and established that stellar black holes exist in binary systems and that they merge within the lifetime of the universe (CERN Courier January/February 2017 p16). Surprisingly, the two black holes were each about 30 times the mass of the Sun – much heavier than astrophysical expectations.

Run 2 surprises

Similar to Initial LIGO, we plan to reach Advanced LIGO design sensitivity in steps. After completion of the four-month-long first data run (called O1) in January 2016, we improved the sensitivity of the Livingston interferometer, increasing its range for binary neutron-star mergers from 60 Mpc to 100 Mpc, but fell somewhat short at Hanford owing to technical issues, which we decided to fix after LIGO’s second observational run (O2). We have now reported a total of four black-hole-merger events and are beginning to determine characteristics such as mass distributions and spin alignments that will help distinguish between the different possibilities for the origin of such heavy black holes. The leading ideas are that they originate in low-metallicity parts of the universe, were produced in dense clusters, or are primordial. They might even constitute some of the dark matter.

Advanced LIGO’s O2 run ended in August this year. Although it seemed almost impossible that it could be as exciting as O1, several more black-hole binary mergers have been reported, including one after the Virgo interferometer in Italy joined O2 in August and dramatically improved our ability to locate the direction of the source. In addition, the orientation of Virgo relative to the two LIGO interferometers enabled the first information on the polarisation of the gravitational waves. Together with other measurements, this allowed us to limit the existence of polarisation modes beyond those of general relativity and showed that the LIGO–Virgo event is consistent with the predicted two-state (tensor) polarisation picture.

Then, on 17 August, we really hit the jackpot: our interferometers detected a neutron-star binary merger for the first time. We observed a coincident signal in both LIGO and Virgo with strikingly different properties from the black-hole binary mergers we had spotted earlier. Like those, this event entered our detectors at low frequencies and swept up to higher frequencies, but it lasted much longer (around 100 s) and reached much higher frequencies. This is because the masses in the binary system were much lower and, in fact, are consistent with being neutron stars. A neutron star results from the collapse of a star into a compact object of between 1.1 and 1.6 solar masses. We have identified our event as the merger of two neutron stars, each about the size of Geneva but with several hundred thousand times the mass of the Earth.

As we accumulate more events and improve our ability to record their waveforms, we look forward to studying nuclear physics under these extreme conditions. This latest event was the first observed gravitational-wave transient phenomenon also to have electromagnetic counterparts, representing multi-messenger astronomy. Combining the LIGO and Virgo signals, the source of the event was narrowed down to a location in the sky of about 28 square degrees, and it was soon recognised that the Fermi satellite had detected a gamma-ray burst shortly afterwards in the same region. A large and varied number of astronomical observations followed. The combined set of observations has resulted in an impressive array of new science and papers on gamma-ray bursts, kilonovae, gravitational-wave measurements of the Hubble constant, and more. The result even supports the idea that binary neutron-star collisions are responsible for the very heavy elements, such as platinum and gold.

Going deeper

Much has happened since our first detection, and this bodes well for the future of this new field. Both LIGO and Virgo entered a 15-month shutdown at the end of August to further improve noise levels and raise their laser power. At present, Advanced LIGO is about a factor of two below its design goal (corresponding to a factor of eight in event rates). We anticipate reaching design sensitivity by about 2020, after which the KAGRA interferometer in Japan will join us. A third LIGO interferometer (LIGO-India) is also scheduled for operation in around 2025. These observatories will constitute a network offering good global coverage; they will accumulate a large sample of binary merger events, achieve improved pointing accuracy for multi-messenger astronomy and, we hope, observe other sources of gravitational waves. This will not be the end of the story. Beyond the funded programme, we are developing technologies to improve our instruments beyond Advanced LIGO, including improved optical coatings and cryogenic test masses.

In the longer term, concepts and designs already exist for next-generation interferometers with typically 10 times better sensitivity than will be achieved by Advanced LIGO and Virgo (see panel below). In Europe, a mature concept called the Einstein Telescope envisages an underground interferometer facility in a triangular configuration, while in the US a very long (approximately 40 km) LIGO-like interferometer is under study. The science case for such next-generation devices is being developed through the Gravitational Wave International Committee (GWIC), the gravitational-wave field’s equivalent of the International Committee for Future Accelerators (ICFA) in particle physics. Although the science case appears very strong and technical solutions seem feasible, these are still very early days and many questions must be resolved before a new generation of detectors is proposed.

To fully exploit the new field of gravitational-wave science, we must go beyond ground-based detectors and into the pristine seismic environment of space, where different gravitational-wave sources become accessible. As described earlier, the lowest frequencies accessible to Earth-based observatories are about 10 Hz. The Laser Interferometer Space Antenna (LISA), a European Space Agency project scheduled for launch in the early 2030s, was approved earlier this year and will cover frequencies of around 10⁻¹–10⁻⁴ Hz. LISA will consist of three satellites separated by 2.5 × 10⁶ km in a triangular configuration in a heliocentric orbit, with light travelling continually along each arm to monitor the satellite separations for deviations caused by a passing gravitational wave. A test mission, LISA Pathfinder, was recently flown and demonstrated the key performance requirements for LISA in space (CERN Courier November 2017 p37).

Meanwhile, pulsar-timing arrays are being implemented to monitor signals from millisecond pulsars, with the goal of detecting low-frequency gravitational waves by studying correlations between pulsar arrival times. The sensitivity range of this technique is 10⁻⁶–10⁻⁹ Hz, where gravitational waves from massive black-hole binaries in the centres of merging galaxies, with periods of months to years, could be studied.

An ultimate goal is to study the Big Bang itself. Gravitational waves are not absorbed as they propagate and could in principle probe back to the very earliest times, whereas photons only take us to within 300,000 or so years of the Big Bang. However, we do not yet have detectors sensitive enough to pick up early-universe signals. The imprint of gravitational waves on the cosmic microwave background has also been pursued, by the BICEP2 experiment, but background issues so far mask a possible signal.

Although gravitational-wave science is clearly in its infancy, we have already learnt an enormous amount and numerous exciting opportunities lie ahead. These vary from testing general relativity in the strong-field limit to carrying out multi-messenger gravitational-wave astronomy over a wide range of frequencies – as demonstrated by the most recent and stunning observation of a neutron-star merger. Since Galileo first looked into a telescope and saw the moons of Jupiter, we have learnt a huge amount about the universe through modern-day electromagnetic astronomy. Now, we are beginning to look at the universe with a new probe and it does not seem to be much of a stretch to anticipate a rich new era of gravitational-wave science.

CERN LIGO–Virgo meeting weighs up 3G gravitational-wave detectors

Similar to particle physicists, gravitational-wave scientists are contemplating major upgrades to present facilities and developing concepts for next-generation observatories. Present-generation (G2) gravitational-wave detectors – LIGO in Hanford, Livingston and India, Virgo in Italy, GEO600 in Germany and KAGRA in Japan – are in different stages of development and have different capabilities (see main text), but all are making technical improvements to better exploit the science potential from gravitational waves over the coming years. As the network develops, the more accurate location information will enable the long-time dream of studying the same astrophysical event with gravitational waves and their electromagnetic and neutrino counterpart signals.

The case for building more sensitive, next-generation gravitational-wave detectors is becoming very strong, and technological R&D and design efforts for 3G gravitational-wave detectors may have interesting overlaps with both CERN capabilities and future directions. The 3G concepts have many challenging new features, including: making longer arms; going underground; incorporating squeezed quantum states; developing lower-thermal-noise coatings; developing low-noise cryogenics; implementing Newtonian-noise cancellation; incorporating adaptive controls; new computing capabilities and strategies; and new data-analysis methods.

In late August, coinciding with the end of the second Advanced LIGO observational run, CERN hosted a LIGO–Virgo collaboration meeting. On the final day, a joint meeting between LIGO–Virgo and CERN explored possible synergies between the two fields. It provided strong motivation for next-generation facilities in both particle and gravitational physics and revealed intriguing overlaps between them. On a practical level, the event identified issues facing both communities, such as geology and survey, vacuum and cryogenics, control systems, computing and governance.

The time needed for R&D, construction and commissioning is expected to be around a decade, and some of the problems are close to intractable. Cryogenics will be used to bring the mirrors to a temperature of a few kelvin. The mirrors themselves are coated using ion-beam deposition to obtain a controlled reflectivity that must be uniform over areas 1 m in diameter. These mirrors operate in ultra-high vacuum, and residual gas-density fluctuations must be minimal along vacuum cavities of several tens of kilometres, which will be the approximate footprint of the 3G scientific infrastructure.

Data storage and analysis is another challenge for both gravitational-wave and particle physicists. Unlike the large experiments at the LHC, which count or measure energy deposition in millions of pixels at the detector level, interferometers continuously sample signals from hundreds of channels, generating a large volume of waveform data. This places major demands on the computing infrastructure, and the analysis of the first gravitational-wave events already called on the Grid infrastructure.

Interferometers have to be kept at an accurately controlled working point, with the mirrors used for gravitational-wave detection positioned and oriented by a feedback control system that must not introduce additional noise. The sensors and actuators differ from those in particle accelerators, but the control techniques are similar.

Comparisons of the science capabilities, costs and technical feasibility for the next generation of gravitational-wave observatories are under active discussion, as is the question of how many 3G detectors will be needed worldwide and how similar or different they need be. Finally, there were discussions of how to form and structure a worldwide collaboration for the 3G detectors and how to manage such an ambitious project – similar to the challenge of building the next big particle-physics project after the LHC.

Barry Barish, the author of this feature, shared the 2017 Nobel Prize in Physics with Kip Thorne and Rainer Weiss for the discovery of gravitational waves (CERN Courier November 2017 p37).
