The Antihydrogen TRAP (ATRAP) experiment at CERN’s Antiproton Decelerator has reported a new measurement of the antiproton’s magnetic moment with an unprecedented uncertainty of 4.4 parts per million (ppm) – a result 680 times more precise than previous measurements. The unusually large jump in precision comes from the experiment’s ability to trap individual protons and antiprotons, as well as from the use of a large magnetic gradient to gain sensitivity to the tiny magnetic moment.
By applying its single-particle approach to the study of antiprotons, the ATRAP experiment has been able to make precise measurements of the charge, mass and magnetic moment of the antiproton. Using a Penning trap, the antiproton is suspended at the centre of an iron ring-electrode that is sandwiched between copper electrodes. Thermal contact with liquid helium keeps the electrodes at 4.2 K, providing a nearly perfect vacuum that eliminates the stray matter atoms that could otherwise annihilate the antiproton. Static and oscillating voltages applied to the electrodes allow the antiproton to be manipulated and its properties to be measured.
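For orientation, a magnetic moment measured in a Penning trap is conventionally expressed through frequency ratios; the relation below is standard textbook background rather than a description of ATRAP’s specific procedure. The magnetic moment in nuclear magnetons equals the ratio of the spin-precession (Larmor) frequency to the cyclotron frequency of the particle in the same magnetic field:

\[
\frac{\mu_{\bar p}}{\mu_N} \;=\; \frac{g_{\bar p}}{2} \;=\; \frac{f_L}{f_c},
\]

so comparing the proton and antiproton moments amounts to comparing two such frequency ratios.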
The result is part of an attempt to understand the matter–antimatter imbalance of the universe. In particular, a comparison of the antiproton’s magnetic moment with that of the proton tests the Standard Model and its CPT theorem at high precision. The ATRAP team found that the magnetic moments of the antiproton and proton are “exactly opposite”: equal in strength but opposite in direction with respect to the particle spins, consistent with the prediction of the Standard Model and the CPT theorem to 5 parts per million.
However, the potential for much greater measurement precision puts ATRAP in a position to test the Standard Model prediction much more stringently. Combining the single-particle methods with new quantum methods that make it possible to observe individual antiproton spin flips should make it feasible to compare an antiproton and a proton to 1 part per billion or better.
The “winter” conferences earlier this year saw the LHCb collaboration present three important results from its increasingly precise search for new physics.
One fascinating area of study is the quantum-mechanical process in which neutral mesons such as the D0, B0 and B0s can oscillate between their particle and antiparticle states. The B0s mesons oscillate with by far the highest frequency, about 3 × 10¹² times per second – on average about nine times during their lifetime. In an updated study, the collaboration looked at the decays of B0s mesons into Ds–π+, with the Ds– decays reconstructed in five different channels. While the B0s oscillation frequency Δms has been measured before, the oscillations themselves had previously been seen only by folding the decay-time distribution onto itself at the period of the measured oscillation. In this updated analysis the oscillation pattern is spectacularly visible over the full decay-time distribution, as figure 1 shows. The measured value of the oscillation frequency is Δms = 17.768 ± 0.023 ± 0.006 ps⁻¹, which is the most precise in the world (LHCb collaboration 2013a).
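As a quick arithmetic cross-check (an illustration, treating Δms as an angular frequency in units where ħ = 1, rather than a statement from the paper), the quoted value corresponds to an oscillation frequency of

\[
f_{\mathrm{osc}} \;=\; \frac{\Delta m_s}{2\pi} \;\approx\; \frac{17.77\ \mathrm{ps}^{-1}}{2\pi} \;\approx\; 2.8 \times 10^{12}\ \mathrm{s}^{-1},
\]

in line with the “about 3 × 10¹² times per second” quoted above.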
CP violation can occur in the B0s sector – in the interference between the oscillation and decay of the meson – but it is expected to be a small effect in the Standard Model. Knowledge of such CP-violating parameters is important because they set the scale of the difference between properties of matter and antimatter; they may also reveal effects of physics beyond the Standard Model. LHCb has previously reported on a study of B0s decays into J/ψ φ and J/ψ π+π– final states, but now the analysis has been finalized. One important improvement is in the flavour tagging, which determines whether the initial state was produced as a B0s or anti-B0s meson. This decision was previously based on “opposite-side” tagging, i.e. from measuring the particle/antiparticle nature of the other b-quark produced in conjunction with the B0s. The collaboration has now achieved improved sensitivity by including “same-side” tagging, from the charge of a kaon produced close to the B0s as a result of the anti-s-quark produced in conjunction with the B0s. This increases the statistical power of the tagging by about 40%. The values of the CP-violating parameter φs, together with the difference in width of the heavy and light B0s mass states, ΔΓs, are shown in figure 2, which also indicates the small allowed region for these two parameters, corresponding to φs = 0.01 ± 0.07 ± 0.01 rad and ΔΓs = 0.106 ± 0.011 ± 0.007 ps⁻¹ (LHCb collaboration 2013b).
Lastly, the collaboration has opened a door for important future measurements with a first study of the time-dependent CP-violating asymmetry in hadronic B0s meson decays into a φφ pair, a process that is mediated by a so-called penguin diagram in the Standard Model. Both φ mesons decay in turn into a K+K– pair. The invariant mass spectrum of the four-kaon final state shows a clean signal of about 880 B0s → φφ decays. A first measurement of the CP-violating phase φs for this decay indicates that it lies in the interval (–2.46, –0.76) rad at 68% confidence level. This is consistent with the small value predicted in the Standard Model, at the level of 16% probability. Although the current precision is limited, this will become a very interesting measurement with the increased statistics from further data taking (LHCb collaboration 2013c).
These results represent the most precise measurements to date, based on data corresponding to the 1 fb⁻¹ of integrated luminosity that LHCb collected in 2011. They are in agreement with the Standard Model predictions and significantly reduce the parameter region in which signs of new physics can still hide.
In a striking and unexpected observation from new studies aimed at understanding the anomalous Y(4260) particle, the international team that operates the Beijing Spectrometer (BESIII) experiment at the Beijing Electron–Positron Collider (BEPCII) has reported that the Y(4260) decays to a new, and perhaps even more mysterious, particle that they have named the Zc(3900).
The Y(4260) has mystified researchers since its discovery by the BaBar collaboration at SLAC in 2005. While other particles with certain similarities have long been successfully explained as bound states of a charmed quark and anticharmed quark, attempts to incorporate the Y(4260) into this model have failed and its underlying nature remains unknown. In December 2012, the BESIII team embarked on a programme to produce large numbers of Y(4260) particles by annihilating electrons and positrons with a total energy tuned to the particle’s mass. Previous studies had used electron–positron collisions at a higher energy, where the Y(4260) mesons were produced via the relatively rare process in which either the original electron or positron particle first radiated a high-energy photon, thereby lowering the total annihilation energy to the mass region of the Y(4260). By contrast, by tuning the beam energies to the particle’s mass, BEPCII can produce the Y(4260) directly and more efficiently. During the first two weeks of the programme, BESIII already collected the world’s largest sample of Y(4260) decays and by the end of the first month there was strong evidence pointing to the existence of the Zc(3900).
The anomalous charmonium particles – such as the Y(4260) and, now, the Zc(3900) – appear to be members of a new class of recently discovered particles. Called the XYZ mesons, they are adding new dimensions to the study of the strong force. QCD, the theory of the strong force, allows more possibilities for charmonium mesons than simply a charmed quark bound to an anticharmed quark. One possibility is that gluons may exist inside mesons in an excited state, a configuration referred to as “hybrid charmonium”. An alternative is that more than just a charmed and anticharmed quark may be bound together to form a “tetraquark” or a molecule-like meson.
Some progress has been made recently in using lattice QCD to account for the existence of the Y(4260) as a state of hybrid charmonium. However, the hybrid picture cannot explain the newly discovered Zc(3900), which decays into a charged pion plus a neutral J/ψ. To decay in this way, the Zc(3900) must contain a charmed quark and an anticharmed quark (to form the J/ψ) together with something that carries electric charge, which therefore cannot be a gluon. To have nonzero charge, the Zc(3900) cannot be a hybrid but must also contain lighter quarks. Different theoretical models have been proposed that attempt to explain how this could come about. The positively charged Zc(3900) particle could be a tightly bound four-quark composite of a charmed and anticharmed quark pair plus an additional up quark and antidown quark. Or, perhaps, the Zc(3900) is a molecule-like structure comprising two mesons, each of which contains a charmed quark (or anticharmed quark) bound to a lighter antiquark (or quark). Another scenario is that the Zc(3900) is an artefact of the interaction between these two mesons.
Whatever the explanation, the appearance of such an exotic state in the decay of another exotic state was not anticipated by most researchers. Now, the ball is clearly in the experimenters’ court and there is much hope – by theorists and experimenters alike – that with more data, the veil that continues to shroud these mysterious particles can be lifted.
• The Beijing Spectrometer (BESIII) collaboration has some 350 members from 50 institutions in 11 countries.
The international Borexino collaboration has released results from a new measurement of geoneutrinos corresponding to 1352.60 live days and, after all selection criteria have been applied, about 187 tonnes of liquid scintillator (an exposure of 3.7 × 10³¹ proton × year). This exposure is 2.4 times higher than that of the measurement made in 2010.
Borexino is a liquid-scintillator detector built underground at the INFN Gran Sasso National Laboratory in central Italy, principally to detect solar neutrinos. However, because of its high level of radiopurity – unmatched elsewhere in the world – it can also detect rare events such as the interactions of geoneutrinos. These are electron-antineutrinos produced in the decays of long-lived radioactive elements (⁴⁰K, ²³⁸U and ²³²Th) in the Earth’s interior.
From the data collected, 46 electron-antineutrino candidates have been found, about 30% of them geoneutrinos. Borexino has also detected electron-antineutrinos from nuclear power plants around the world. These latter antineutrinos give a signal of about 31 events, which is in good agreement with the number expected from the 446 nuclear cores operating during the period of interest (December 2007 to August 2012) and from current knowledge of the parameters of neutrino oscillations. The total expected background for electron-antineutrinos in Borexino is determined to be about 0.7 events. The small background is a result of the high level of radiopurity of the liquid scintillator. For the current measurement, the null geoneutrino hypothesis has a probability of 6 × 10⁻⁶.
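For scale, and purely as an illustrative conversion not quoted by the collaboration, a one-sided Gaussian tail probability of 6 × 10⁻⁶ corresponds to a significance of roughly

\[
Z \;=\; \Phi^{-1}\!\left(1 - 6\times10^{-6}\right) \;\approx\; 4.4\,\sigma .
\]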
The detection of geoneutrinos offers a unique tool to probe uranium and thorium abundances within the mantle. By considering the contribution from the local crust (around the Gran Sasso region) and the rest of the crust to the geoneutrino signal, the signal from the radioactivity of uranium and thorium in the mantle can be extracted. The latest results from Borexino, together with the measurement by the KamLAND experiment in Japan, indicate a signal from the mantle of 14.1 ± 8.1 TNU (1 TNU = 1 event/year/10³² protons).
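To relate the TNU unit to this data set (an illustrative conversion using the exposure quoted above, not a number taken from the publications): with an exposure of 3.7 × 10³¹ proton × year, a signal of S TNU corresponds to

\[
N \;\approx\; S \times \frac{3.7\times10^{31}}{10^{32}} \;\approx\; 0.37\,S\ \text{events},
\]

so a mantle signal of 14.1 TNU would translate into roughly five events over the full measurement period.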
These new results mark a breakthrough in the understanding of the origin and thermal evolution of the Earth. The good agreement between the ratio of thorium to uranium determined from geoneutrino signals and the value obtained from chondritic meteorites has fundamental implications for cosmochemical models and the processes of planetary formation in the early Solar System.
By measuring the geoneutrino flux at the surface, the contribution of radioactive elements to the Earth’s heat budget can be explored. The radiogenic heat is of great interest for understanding a number of geophysical processes, such as mantle convection and plate tectonics. For the first time, two independent geoneutrino detectors – Borexino and KamLAND, located at different sites on the planet – are providing the same constraints on the radiogenic heat power of the Earth set by the decays of uranium and thorium. With these latest results, the Borexino collaboration has also fitted the data for a possible georeactor, setting an upper limit on its output power of 4.5 TW at 95% confidence level.
The OPERA experiment at Gran Sasso has observed a third example of neutrino oscillation, with a muon-neutrino produced at CERN detected as a τ neutrino at the Gran Sasso laboratory. Such an extremely rare event had been observed only twice before.
OPERA, which is run by an international collaboration of 140 physicists from 28 research institutes in 11 countries, was set up for the specific purpose of discovering neutrino oscillations of this kind. A beam of neutrinos produced at CERN travels towards the INFN Gran Sasso National Laboratory some 730 km away. Thanks to their weak interactions, the neutrinos arrive almost unperturbed at the giant OPERA detector, which consists of more than 4000 tonnes of material, has a volume of some 2000 m3 and contains nine million photographic plates. After the first neutrinos arrived at Gran Sasso in 2006, the experiment gathered data for five consecutive years, from 2008 to 2012. The first τ neutrino was observed in 2010, the second in 2012.
The arrival of the τ neutrino is an important confirmation of the two previous observations. Statistically, the observation of three τ neutrinos enables the collaboration to claim confidently that muon neutrinos oscillate to τ neutrinos. Data analysis is set to continue for another two years.
The long awaited results from ESA’s Planck mission, based on the most detailed observations to date of the cosmic microwave background (CMB), were released on 21 March. While the new data confirm to high precision the standard model of cosmology, the detection of several anomalies could be hints of new physics to be understood.
ESA’s Planck and Herschel missions were launched simultaneously by an Ariane 5 rocket on 14 May 2009 (CERN Courier July/August 2009 p6). Since then, Planck has been scanning the whole sky every six months. After the results on galactic and extragalactic foregrounds (CERN Courier April 2012 p15), the Planck collaboration has now released the CMB results, the prime scientific objective of the mission. The collaboration issued almost 30 publications simultaneously, together with the data from the first half of the mission (15.5 months).
The CMB is a snapshot of the universe when it was 380,000 years old. At that time, the young universe was filled with a hot, dense medium of interacting protons, electrons and photons at about 2700 °C. When the protons and electrons combined to form hydrogen atoms, the radiation was set free. As the universe expanded, this radiation was stretched to microwave wavelengths, today equivalent to a temperature of just 2.7 degrees above absolute zero. The CMB is extremely uniform all over the sky. There are only tiny temperature fluctuations (at a level of 10⁻⁵) that correspond to regions of slightly different densities at very early times. Gravity then acted to amplify these fluctuations into the galaxies and galaxy clusters that are seen today.
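The numbers in the previous paragraph are tied together by the redshifting of the radiation (a standard relation; the redshift of recombination, z ≈ 1100, is supplied here as an assumption rather than taken from the text):

\[
T_{\mathrm{today}} \;=\; \frac{T_{\mathrm{recombination}}}{1+z} \;\approx\; \frac{3000\ \mathrm{K}}{1100} \;\approx\; 2.7\ \mathrm{K}.
\]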
The fluctuations are of different amplitude on different angular scales. This is described by the power spectrum derived from the all-sky map of the CMB. The observed shape of the power spectrum can then be fitted by a model curve, whose shape is controlled by a set of cosmological parameters. There are only six free parameters for the standard model of a flat universe with cold dark matter and a cosmological constant, ΛCDM. Possible deviations from a pure ΛCDM cosmology can be tested by freeing additional parameters of the model. All such searches for deviations in the Planck data have so far yielded nothing significant. The main result of Planck is thus a remarkable confirmation of the standard ΛCDM model of the universe.
Compared with NASA’s Wilkinson Microwave Anisotropy Probe (WMAP) satellite, Planck has a much higher sensitivity, a finer angular resolution and a larger spectral coverage, with nine frequency bands instead of five. Yet despite this, Planck has not fundamentally changed the view of the cosmos derived by WMAP (CERN Courier May 2006 p12, May 2008 p8). The updated energy-density content of the present universe consists of slightly higher fractions of ordinary, baryonic matter (4.9% instead of 4.5%) and of dark matter (26.8% instead of 22.7%), compensated by a decrease in the fraction of dark energy (68.3% instead of 72.8%). Planck has also confirmed the existence of some large-scale anomalies seen by WMAP, such as a lack of power in fluctuations at large angular scales, a small asymmetry between the two sides of the ecliptic plane and the WMAP cold spot (CERN Courier October 2007 p13). Planck shows that these anomalies are, indeed, of cosmic origin but they are at a level still marginally compatible (2–3σ) with statistical variations on the sky.
The main highlights of the Planck results are constraints on the number and mass of relativistic neutrinos (Neff = 3.30 ± 0.27 and Σmν < 0.66 eV), limits on primordial non-Gaussianity (fNL = 2.7 ± 5.8), constraints on inflation models (ns = 0.96 ± 0.01 and r < 0.11), and a mild tension between the amplitude of matter fluctuations derived from the CMB (σ8 = 0.82 ± 0.02) and from galaxy clusters (σ8 = 0.77 ± 0.02). Possibly the most unexpected result is a precise determination of the famous Hubble constant, which describes the rate of expansion of the universe, at a significantly lower value (H0 = 67.9 ± 1.5 km/s/Mpc) than derived by other means. This was one of the prime objectives of the Hubble Space Telescope; now it is Planck that makes the most precise determination so far. The next milestone for Planck will be in 2014 with the release of the final products for the complete mission, including the polarization measurements. There is still potential for more exotic discoveries.
After more than three years of highly successful operation, the ALICE detector is about to undergo a major programme of consolidation and upgrade during the Long Shutdown 1 (LS1) of CERN’s accelerator complex. This follows an intense first running period characterized by the continuous record-breaking performance of the LHC. While the shutdown provides time to take stock of the wealth of data collected, the ongoing analysis, the busy programme of work in the experiment’s cavern at Point 2 and the planning for future upgrades will ensure that everyone in the collaboration is kept busy.
The ALICE detector is specially designed for heavy-ion collisions, which are foreseen as part of the LHC programme for four weeks a year. The LHC delivered an integrated lead–lead luminosity of 150 μb⁻¹ during the heavy-ion periods in 2010 and 2011, as well as 30 nb⁻¹ of proton–lead luminosity in 2013. Together with data collected during normal proton–proton running, as well as in a dedicated five-day proton–proton run in 2011 at the equivalent lead–nucleon energy, these three data sets have provided an excellent basis for an in-depth look at the physics of quark–gluon plasma. With the recent successful conclusion of the proton–lead programme in particular, where the LHC and injectors once again showed their amazing capabilities, the physics-analysis teams in ALICE are certainly not on standby but are more active and excited than ever.
Down the cavern
As soon as LHC beam operations ended on 14 February, the occupation of the car park at the ALICE experimental site began to rise sharply, indicating the start of the major shutdown activities. (Long-term observations have shown that there is good proportionality between the number of parked cars and activities in the cavern.) The first of these, as in any ALICE shutdown, concerns the removal of hundreds of tonnes of shielding blocks from the access shaft and the cavern. This is to allow the opening of the large doors of the solenoid magnet and give access to the ALICE detector. This sequence is now well established because even during the short winter stops of 2010/2011 and 2011/2012, the ALICE detector was opened for installation of the electromagnetic calorimeter (EMCAL) and modules of the transition radiation tracker (TRD).
So what are the major plans for ALICE during LS1? The main activity on the detector will be concerned with the installation of the dijet calorimeter (DCAL), an extension of the existing EMCAL system that adds 60° of azimuthal acceptance opposite the existing 120° of the EMCAL’s acceptance. This new subdetector will be installed on the bottom of the solenoid magnet, which currently houses three modules of the photon spectrometer (PHOS). An entirely new rail system and cradle will be installed to support the three PHOS modules and eight DCAL modules, which together weigh more than 100 tonnes.
The removal of the present structures and the installation of the new services, support structures and then the DCAL and PHOS modules will take up most of this year. The installation of five modules of the TRD will follow, completing this complex detector system, which consists of 18 units. This work is complicated by the fact that the installation path is obstructed by major support structures for services, which will have to be held temporarily by other means.
In addition to these mainstream detector activities, all of the 18 ALICE subdetectors will undergo major improvements and consolidation efforts during LS1. The computers and discs of the online systems have reached their end of life and will also have to be replaced, followed by upgrades of the operating systems and online software.
A major part – indeed, most – of the shutdown cost and human resources will go into the consolidation and upgrade of the ALICE infrastructure. The four levels of ALICE counting rooms, which house the data-acquisition, high-level trigger and detector control systems, as well as most of the detector read-out electronics, have an electrical infrastructure that was installed in the era of the Large Electron–Positron collider and is now outdated. The renewal of this infrastructure and the installation of a new and significantly more powerful uninterruptible power-supply system form a key element in ensuring the correct operation of ALICE after LS1.
Major safety systems will also have to be installed during LS1. An area of racks under the large dipole magnet, which is inaccessible in the event of a fire, will be equipped with a CO2 extinguishing system and the entire volume inside the solenoid magnet will be equipped with a nitrogen extinguishing system.
The production of chilled water will also undergo a major upgrade as a result of increased demands on cooling and ventilation for ALICE and the LHC. The need for doubling the cooling air-flow inside the solenoid magnet to 10,000 m³/h requires the addition of a new ventilation machine and large ventilation ducts from the surface to the cavern.
The shutdown activities have all been formulated in work packages, analysed for safety aspects and scheduled in detail. In addition, the extraction of LHC magnets through the ALICE shaft, as well as a large number of visitor groups that will come to see the experiment, will pose a big challenge to day-to-day planning for LS1.
All of these efforts will ensure that ALICE is in good shape for the three-year LHC running period after LS1, when the collaboration looks forward to heavy-ion collisions at the top LHC energy of 5.5 TeV/nucleon at luminosities in excess of 10²⁷ Hz/cm².
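To put that luminosity in perspective, here is an order-of-magnitude estimate of the interaction rate; the total lead–lead hadronic cross-section of roughly 8 b is an assumed round number, not a figure from the text:

\[
R \;=\; L\,\sigma \;\approx\; 10^{27}\ \mathrm{cm^{-2}\,s^{-1}} \times 8\ \mathrm{b} \;\approx\; 8\times10^{3}\ \mathrm{collisions\ per\ second}.
\]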
However, the LS1 efforts go beyond the hardware activities that are currently under way. The ALICE collaboration has plans for a major upgrade during the next long shutdown, LS2, currently scheduled for 2018. Then the entire silicon tracker will be replaced by a monolithic-pixel tracker system; the time-projection chamber will be upgraded with gaseous electron-multiplier (GEM) detectors for continuous read-out; and all of the other subdetectors and the online systems will prepare for a 100-fold increase in the number of events written to tape. With only five years to go before this major upgrade, the ALICE collaboration is also busy on this front, preparing technical design reports for submission later this year.
With a fantastic set of data already in hand, well prepared activities for LS1 underway and the prospect of a major upgrade during LS2, the ALICE collaboration is in good health and is pursuing with unwavering enthusiasm its exploration of the mysteries of the QCD phase transitions, in a scientific programme that will extend well into the next decade.
In the history of particle physics, July 2012 will feature prominently as the date when the ATLAS and CMS collaborations announced that they had discovered a new particle with a mass near 125 GeV in studies of proton–proton collisions at the LHC. The discovery followed just over a year of dedicated searches for the Higgs boson, the particle linked to the Brout–Englert–Higgs mechanism that endows elementary particles with mass. At this early stage, the phrase “Higgs-like boson” was the recognized shorthand for a boson whose properties were yet to be fully investigated. The outstanding performance of the LHC in the second half of 2012 delivered four times as much data at 8 TeV in the centre of mass as were used in the “discovery” analyses. Thus equipped, the experiments were able to present new results at the 2013 Rencontres de Moriond in March, giving the particle-physics community enough evidence to name this new boson “a Higgs boson”.
At the Moriond meeting, in addition to a suite of final results from the experiments at Fermilab’s Tevatron on the same subject, the ATLAS and CMS collaborations presented preliminary new results that further elucidate the nature of the particle discovered just eight months earlier. The collaborations find that the new particle is looking more and more like a Higgs boson. However, it remains an open question whether this is the Higgs boson of the Standard Model of particle physics, or one of several such bosons predicted in theories that go beyond the Standard Model. Finding the answer to this question will require more time and data.
This brief summary provides an update of the measurements of the properties of the newly discovered boson using, in most cases, the full proton–proton collision data sample recorded by the ATLAS and CMS experiments in 2011 and 2012 for the H→γγ, H→ZZ(*)→4l, H→WW(*)→lνlν, H→τ+τ– and H→bb channels, corresponding to integrated luminosities of up to 5 fb⁻¹ at √s = 7 TeV and up to 21 fb⁻¹ at √s = 8 TeV. In the intervening time, CMS and ATLAS have also developed searches for rarer decays – such as H→Zγ or H→μ+μ– – and for invisible or undetectable decays expected in theories beyond the Standard Model.
Whether or not the new particle is a Higgs boson is demonstrated by how it interacts with other particles, as well as by its own quantum properties. For example, a Higgs boson is postulated to have no spin and in the Standard Model its parity – a measure of how its mirror image behaves – should be positive. ATLAS and CMS have compared a number of alternative spin-parity (JP) assignments for this particle and, in pairwise hypothesis tests, the hypothesis of zero spin and positive parity (0+) is consistently favoured, as summarized in Table 1.
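To illustrate the idea of a pairwise spin-parity comparison, here is a minimal toy sketch in Python, assuming a simple likelihood-ratio test evaluated on pseudo-experiments; the one-dimensional densities, event yields and names (p_h0, p_h1, q) are hypothetical illustrations, not the collaborations’ analysis code.

```python
# Toy pairwise hypothesis test: compare two spin-parity hypotheses with a
# likelihood-ratio test statistic evaluated on pseudo-experiments.
# The 1D "angular" densities are hypothetical stand-ins for the real
# multivariate discriminants used by ATLAS and CMS.
import numpy as np

rng = np.random.default_rng(2013)


def p_h0(x):
    """Hypothetical density for the 0+ hypothesis: flat in x on [-1, 1]."""
    return np.full_like(x, 0.5)


def p_h1(x):
    """Hypothetical density for the alternative hypothesis: 3x^2/2 on [-1, 1]."""
    return 1.5 * x ** 2


def sample_h1(n):
    """Inverse-CDF sampling for p_h1: F(x) = (x^3 + 1)/2, so x = (2u - 1)^(1/3)."""
    u = rng.uniform(size=n)
    return np.cbrt(2.0 * u - 1.0)


def q(x):
    """Test statistic q = -2 ln[L(H1)/L(H0)]: positive values favour H0."""
    return -2.0 * np.sum(np.log(p_h1(x)) - np.log(p_h0(x)))


n_events = 100   # hypothetical signal yield per pseudo-experiment
n_toys = 5000    # number of pseudo-experiments per hypothesis

q_h0 = np.array([q(rng.uniform(-1.0, 1.0, n_events)) for _ in range(n_toys)])
q_h1 = np.array([q(sample_h1(n_events)) for _ in range(n_toys)])

# Separation: take the median outcome expected under H0 and ask how often
# H1-like data produce a q at least that large. A small fraction means the
# alternative would be clearly disfavoured by typical H0-like data.
q_med_h0 = np.median(q_h0)
frac_h1_above = np.mean(q_h1 >= q_med_h0)
print(f"median q under H0: {q_med_h0:.1f}")
print(f"fraction of H1 toys at or above it: {frac_h1_above:.4f}")
```

In the real analyses the discriminating variables are multi-dimensional angular and mass distributions, but the broad logic of comparing an observed test statistic with its distributions under the two hypotheses is the same.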
In CMS, the presence of a signal has been established in each of several expected decay channels. The H→γγ and H→ZZ(*)→4l channels point to a mass between 125.4 GeV and 125.8 GeV. For mH = 125 GeV, an excess of 4.1σ is observed in the H→WW(*)→lνlν channel and there are remarkable positive results in the decays to b quarks (2.2σ) and τ leptons (2.9σ), an important hint that this Higgs boson also couples to fermions. As expected in the Standard Model, the search for H→Zγ has not yielded a signal – nevertheless constraining the possibilities of models beyond the Standard Model.
Apart from exploiting the larger set of 8 TeV data, the CMS analyses have benefited from many improvements since the discovery announcement, from revised calibration constants to more sensitive analysis methods. In the H→γγ and H→ZZ(*)→4l channels, the largest difference is in the use of event classes with specific topologies to exploit the associated production modes. In these channels, the mass measurement has also benefited from improved energy and momentum resolution. Figure 1 shows the data entering the H→ZZ(*)→4l analysis and it gives a sense of how individual events build up to a 6.7σ excess of events and how their mass resolution (also depicted) allows a measurement of the mass of the new boson at 125.8 ± 0.5(stat.) ± 0.2(syst.) GeV, the precision being dominated by statistics and already better than 0.5%. This mass measurement is in remarkable agreement with the value of 125.4 ± 0.5(stat.) ± 0.6(syst.) GeV measured in the H→γγ channel, where the excess has a significance of 3.2σ. The updated CMS analysis of H→γγ, which takes advantage of the improved detector calibration, yields a result close to that expected for the Standard Model Higgs boson in terms of signal strength, μ = σ/σSM = 0.78 +0.28/−0.26.
In figure 2, an overview of the main decays studied in CMS shows how evidence for a Higgs boson can be seen in each channel with individual significances ranging from 2.2σ to 6.7σ. With respect to the results presented by CMS last July, there are slight differences in the individual signal strengths: smaller in the H→γγ channel and larger in the H→bb and H→τ+τ– channels. These results strongly indicate that it is a Higgs boson. Overall, the results continue to be fully compatible with the expectation for a Standard Model Higgs boson, while within the current uncertainties many scenarios of physics beyond the Standard Model are still allowed.
For ATLAS, the combined signal strength for H→γγ, H→ZZ(*)→4l, H→WW(*)→lνlν and H→τ+τ– has been determined to be μ = 1.30 ± 0.13(stat.) ± 0.14(syst.) at the new mass measurement of 125.5 ± 0.2(stat.) +0.5/−0.6(syst.) GeV. The collaboration has also measured the ratio of the cross-sections for vector-boson mediated and (predominantly) gluon-initiated processes for producing a Higgs boson, as shown in figure 3. Measurements of relative branching-fraction ratios between the H→γγ, H→ZZ(*)→4l and H→WW(*)→lνlν channels, as well as combined fits testing the fermion and vector coupling sectors, the couplings to W and Z and the loop-induced processes of the Higgs-like boson, show no significant deviation from the Standard Model expectation, as figure 4 shows.
Figure 3 compares a summary of the combined results for Higgs production to the Standard Model expectation and demonstrates an overall consistency. Here, a common signal-strength scale factor, μggF+ttH, has been assigned to the gluon-fusion (ggF) and the small ttH production modes because they both scale predominantly with the Yukawa coupling of the top quark in the Standard Model. For the combination, vector-boson-fusion-like events and gluon-fusion-like events are distinguished within the individual analyses based on the kinematic properties of the event. The combined measured ratio of production scaling factors, μVBF/μggF+ttH = 1.2 +0.7/−0.5, driven by the H→γγ channel measurement, gives more than 3σ evidence for Higgs-boson production through vector-boson fusion.
Having demonstrated overall consistency in terms of production, five tests of the observed coupling scale factors are summarized in figure 4. This shows the overall consistency with the Standard Model hypothesis and places limits on various model extensions for the produced Higgs boson. These tests are implemented according to recommendations from the Higgs Cross-Section Working Group. The ATLAS results assume a single, narrow CP-even Higgs resonance at mH = 125.5 GeV with coupling strengths that may depart from the Standard Model in various prescribed ways. For example, the relative vector boson and fermion coupling strengths (labelled κV, κF) are allowed to vary, giving the experimental constraints on the relative deviation of these quantities shown in the upper section of figure 4. The current results are not powerful enough to resolve the ambiguity in the relative sign of κV and κF. Considering κV > 0, κF has a double minimum leading to the observed structure in the intervals allowed by the data. Figure 4 also shows the results for other benchmark parameterizations where no assumption is made on the total width for the fermion-to-boson coupling-strength ratio (labelled λFV) and where the ratio of W-to-Z couplings is tested (labelled λWZ). Scenarios for physics beyond the Standard Model contributions via loops (labelled κg, κγ) and via invisible or undetectable decays (labelled Bi,u) can similarly be compared with the intervals allowed by the data.
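For reference, the benchmark parameterizations mentioned above use coupling scale factors defined relative to the Standard Model; the definitions below follow the standard framework recommended by the Higgs Cross-Section Working Group and are summarized here for convenience rather than quoted from the text:

\[
\kappa_i \;=\; \frac{g_i}{g_i^{\mathrm{SM}}}, \qquad
\lambda_{FV} \;=\; \frac{\kappa_F}{\kappa_V}, \qquad
\lambda_{WZ} \;=\; \frac{\kappa_W}{\kappa_Z},
\]

so that the Standard Model corresponds to all κ equal to unity.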
After eight months, and thanks to the extraordinary performance of the LHC, the ATLAS and CMS collaborations have revealed more of the true nature of a new boson that is unique in the Standard Model. The more detailed picture that ATLAS and CMS have put together on this newborn boson since July 2012 remains unfailingly consistent with expectations drawn from the Standard Model, with the spin, parity, relative couplings, production and decay mechanisms all consistent at the current level of precision. Using the latest data, alternative hypotheses have been tested but none of them is found to be preferred over the Standard Model; rare decays have been searched for but, as expected in the Standard Model, no evidence for a signal has been found. The more similar this Higgs boson is to the Standard Model expectation, the more time, data and ingenuity will be required in the analyses of the LHC data to provide hints of physics at work beyond the Standard Model. Ultimately, upgraded and new accelerators will be needed to understand the interactions of the Higgs boson at a deeper level but for now it is clear that this boson is a precious thread with which we can hope to unravel more of the remaining mysteries of the universe.
In the autumn of 1982, I was invited to give a series of seven lectures at CERN under the title “Electroweak Interactions”. These were part of the Academic Training Programme, which was aimed at young experimenters working on projects at CERN. The lectures were to be given on successive days on 18–26 November, excluding the weekend of 21–22 November. I had given seminars at CERN on earlier occasions and the response had always been positive. Giving seven lectures in a row could be stressful but at least the subject was in my own domain. I expected the number of people attending to be between 50 and 100.
When I arrived to give my lecture on the first day, I was astonished to see that the auditorium was chock-full of people. (Somebody mentioned later that the number was 400.) For a moment I thought that I had wandered into the wrong auditorium. Seated in the first row were stalwarts of CERN, such as Rolf Hagedorn, Jacques Prentki, Maurice Jacob and André Martin. I could see in the crowd several experienced people whom I knew from the heyday of neutrino physics. It was not at all the kind of audience that I had expected. I began to wonder what I could tell them that they had not heard a dozen times before.
A bold venture
When the opening lecture ended I hastened to return to the dormitory to prepare my second talk. On the way I saw Jack Steinberger, one of the veterans of CERN, for whose course I had once acted as a tutor. I told him that I had come to CERN to give Academic Training lectures and he said, with dismay: “I know that. I looked for my people this morning and there was nobody around, because they had all gone to your lecture.”
That evening I went to the CERN cafeteria for a coffee and there I saw something that I had not noticed before. There was a monitor on the wall and people were watching the screen with great interest. The monitor was showing the rate of proton–antiproton collisions in CERN’s latest challenge – a bold venture designed to produce the intermediate bosons, W and Z. These bosons were predicted by electroweak theory to occur at masses of 80 GeV and 90 GeV, respectively. The synchrotron at CERN that accelerated protons to 400 GeV was, by itself, not capable of producing such massive particles. So CERN had built a smaller ring in which antiprotons produced in conventional proton interactions were accumulated. These antiprotons were compressed to compact beams, then accelerated to 270 GeV in the Super Proton Synchrotron and finally brought into head-on collision with 270 GeV protons. And this audacious idea appeared to be working! The collision rate was low but it was climbing from hour to hour. Now I understood the reason for the crowd in my lecture. CERN was on the way to testing the crucial prediction of electroweak theory, namely the existence of intermediate bosons with masses and properties that were precisely predicted. A confirmation of this prediction would be a triumph for CERN and would probably bring the laboratory its first Nobel prize.
I returned to my room in the dormitory and resumed the writing of my overhead transparencies. I now knew that my lectures would have to focus on precisely the questions that the physicists at CERN would be interested in: the cross-sections for W and Z production; the expected event rates; the angular distribution of the W and Z decay products, etc. People would also want to know how uncertain the predictions for the W and Z masses were and why certain theorists (J J Sakurai and James Bjorken among them) were cautioning that the masses could turn out to be different. The writing of the transparencies turned out to be time consuming. I had to make frequent revisions, trying to anticipate what questions might be asked. To make corrections on the film transparencies, I was using my after-shave lotion, so that the whole room was reeking of perfume. I was preparing the lectures on a day-by-day basis, not getting much sleep. To stay awake, I would go to the cafeteria for a coffee shortly before it closed. Thereafter I would keep going to the vending machines in the basement for chocolate – until the machines ran out of chocolate or I ran out of coins.
After the fourth lecture, the room in the dormitory had become such a mess (papers everywhere and the strong smell of after-shave) that I decided to ask the secretariat for an office where I could work. Office space in CERN is always scarce but they said I could use the office that was previously occupied by Sakurai. At that point I recalled, with sorrow, his tragic and totally unexpected death that I had read about some weeks earlier. I had forgotten that he was a visitor at CERN at the time. I had high regard for him as a physicist. There was a period of some years when we were doing parallel things in connection with the structure of neutral currents. He was always fair and correct in attributing credit and was an excellent lecturer. I had met him quite recently at the Neutrino ’82 Conference in Balatonfüred and at the 1982 International Conference on High-Energy Physics in Paris. When the secretary opened the office for me, many of Sakurai’s books and papers were still in the room. Lying on his desk were a couple of preprints that he had been reading on his last day at the office. I felt uncomfortable about disturbing that scene by bringing in my own papers and I told the secretary that I would continue to work in the dormitory.
Champagne times
The lectures went well. The attendance declined after I had finished with the discussion of intermediate bosons (vector quanta) and Higgs particles (scalar quanta). On the eve of the last lecture, I went rather late to the CERN cafeteria for dinner. The place was almost deserted. I saw that there was one corner that had been screened off for a private get-together. There were sounds of a party, with clinking glasses and the pop of a champagne bottle. Glancing inside the screen, I saw Steinberger and a number of American visitors at CERN. I realised that it was Thursday and they were celebrating Thanksgiving. For a moment I had a desire to join them but my natural diffidence held me back. As I was about to leave, one person emerged from the enclosure. It was Gary Feldman from SLAC. He greeted me and said: “I have been attending your lectures. What are you going to talk about tomorrow?” When I said CP violation he said: “What a shame. I should have loved to hear that but I have to leave in the morning.” He wished me luck.
Before leaving the cafeteria, I glanced at the monitor showing the status of the beams in the collider. The luminosity was still rising. The next morning, after my final lecture, I went over to the analysis room of the UA1 experiment in which physicists from Aachen were participating. They showed me a couple of events that were candidates for the W and Z. It seemed that CERN would have occasion to open champagne bottles, before too long.
I returned to Aachen quite exhausted. I resolved not to give so many lectures again (they had asked for only four/five). I also resolved not to use after-shave as a correcting fluid. But it had been a satisfying visit. I had come to CERN at a time full of suspense. There was a scent of discovery in the air.
On 25 January 1983, eight weeks after my return, CERN held a press conference to announce the discovery of the W boson. The announcement of the Z boson followed on 1 June.
In February 1981, the Proton Synchrotron received and accelerated antiprotons from the Antiproton Accumulator, thus becoming the world’s first Antiproton Synchrotron. On 7 July, transfer to the Super Proton Synchrotron, acceleration and brief storage at 270 GeV were achieved. Carlo Rubbia delayed his departure to the Lisbon High Energy Physics Conference by a day so that on 10 July he was able to announce that the UA1 detector had seen its first proton–antiproton collisions. There were runs at modest intensities in the second half of the year and the first visual records of the collisions came from another experiment (UA5) using large streamer chambers. UA5 was then moved out to make way for UA2, which took its first data in December.
In 1982, an accident to UA1 forced a concentration of the scheduled proton–antiproton running into a single two-month period at the end of the year (October to December). In terms of operating efficiency, it proved a blessing in disguise and research director Erwin Gabathuler happily sacrificed a crate of champagne to the machine-operating crews as the collision rate was taken to 10 times that of the year before. This was the historic run in which the W particles were first observed.
It was astonishing how fast physics results were pulled from the data accumulated up to 6 December 1982. At a Topical Workshop on Proton–Antiproton Collider Physics, held in Rome on 12–14 January 1983, the first tentative evidence for observation of the W particle by the UA1 and UA2 collaborations was presented. Out of the several thousand million collisions that had been seen, a tiny handful gave signals that could correspond to the production of a W in the high-energy collision and its subsequent decay into an electron (or positron if the W was positively charged) and a neutrino. The detectors were programmed to look for high-energy electrons coming out at a relatively large angle to the beam direction. Also, an energy imbalance among the particles around a decay indicated the emergence of a neutrino, which itself cannot be detected in the experimental apparatus.
The tension at CERN became electric, culminating in two brilliant seminars, from Carlo Rubbia (for UA1) on Thursday 20 January and Luigi Di Lella (for UA2) the following afternoon, both with the CERN auditorium packed to the roof. UA1 announced six candidate W events; UA2 announced four. The presentations were still tentative and qualified. However, over the weekend of 22–23 January, Rubbia became more and more convinced. As he put it, “They look like Ws, they feel like Ws, they smell like Ws, they must be Ws”. And, on 25 January, a press conference was called to announce the discovery of the W. The UA2 team reserved judgement at this stage but further analysis convinced them also. What was even more impressive was that both teams could already give estimates of mass in excellent agreement with the predictions (about 80 GeV) of the electroweak theory.
It was always clear that the Z would take longer to find. The theory estimated its production rate to be some 10 times lower than that of the Ws. It implied that the machine physicists had to push their collision rates still higher, and this they did in style in the second historic proton–antiproton run from April to July 1983. They exceeded by 50% the challenging goal that had been set and this time it was director-general Herwig Schopper who forfeited a crate of champagne.
Again there was tension as the run began because the Z did not seem keen to show itself. Although more difficult to produce than the W, its signature is easier to spot because it can decay into an electron–positron pair or a muon pair. Two such high-energy particles flying out in opposite directions were no problem for detectors and data-handling systems that had so cleverly unearthed the W.
On 4 May, when analysing the collisions recorded in the UA1 detector a few days earlier, on 30 April, the characteristic signal of two opposite high-energy tracks was seen. Herwig Schopper reported the event at the “Science for Peace” meeting in San Remo on 5 May. However, the event was not a clean example of a particle–antiparticle pair and it was only after three more events had turned up in the course of the month that CERN went public, announcing the discovery of the Z to the press on 1 June. Again, the mass (near 90 GeV) looked bang in line with theory. Just after the run, Pierre Darriulat was able to announce in July that UA2 had also seen at least four good Z decays.
Beyond the mere observation of the Ws and Zs, their measured behaviour was everything that the electroweak theory predicted. Two independent experiments had confirmed a theory of breathtaking imagination and insight.