
Topics

The Oskar Klein Memorial Lectures 1988–1999

By Gösta Ekspong (ed.)
World Scientific
Hardback: £45
E-book: £34


Perhaps every reader of CERN Courier has heard about the Klein–Gordon equation, the Klein–Nishina (Compton effect) cross-section, the Klein paradox and the Kaluza–Klein compactified five-dimensional unified theory of gravity, electricity and magnetism. However, few will know about the scientist, Oskar Klein (1894–1977), the pre-eminent and visionary Swedish theoretical physicist from Stockholm whose work continues to influence us to this day.

This book is needed. The reason is described eloquently in the contribution by Alan Guth, whose words I paraphrase: how many recognize Oskar as the first name of “this” Klein? Compare here (by birth year, within 10 years): Niels B (1885), Hermann W (1885), Erwin S (1887), Satyendra N B (1894), Wolfgang P (1900), Enrico F (1901), Werner H (1901), Paul A M D (1902), Eugene W (1902), Robert O (1904). Thanks to this book, Oskar K (1894) will take his place on this short list.

Part of the book collects together all of the Oskar Klein Memorial Lectures given since the series began at Stockholm University in 1988, through to 1999, by many well-known theoreticians, from Chen Ning Yang to Gerard ’t Hooft. Some of these lectures relate to Klein because he often happened to “be there” at the beginning of a new field in physics. For example, in early 1948, Klein recognized immediately, following the disambiguation of the pion and muon, that muon decay and common beta decay can be described by the same four-fermion interaction (see the contribution by T D Lee).

The other part of the book – a third of the 450 pages – is a biographical collection about Klein and his pivotal scientific articles (about a fifth of the volume), all presented in English, although Klein published in Danish, French, English, German and Swedish, as a check of the titles in his publication list reveals. Having Klein’s important work all in one place can lead to interesting insights: for me, finding that 24 December 1928 was a special birthday.

On this day, just eight weeks after the Klein–Nishina paper on the interaction of radiation with electrons, the paper on the Klein paradox reached the editors of Zeitschrift für Physik. Klein concludes: “…(the) difficulty of the relativistic quantum mechanics emphasized by Dirac can appear already in purely mechanical problems where no radiation processes are involved.” The “difficulty” was the antiparticle – the positron – yet to be recognized and discovered; its existence allows for both radiative and field-instigated pair production (the “paradox”) when vacuum instability is inherent in a prescribed external field configuration.

The Klein-paradox result soon resurfaced in the work of Werner Heisenberg and Hans Euler, and of Julian Schwinger, on the vacuum properties of QED. Today, as we head towards the centenary of the Klein paradox, pair production in strong fields is being addressed as a priority within the large community interested in ultra-intense laser pulses.

Oskar Klein was always a colleague I wished I could meet, and finally, I have. Thank you, Gösta Ekspong, for this introduction to my new-found hero. While at first my profound personal interest in this book arose from curiosity originating from many years of working out the consequences of the Klein paradox in heavy-ion collisions, I now see how Klein can serve as a role model. This is the book to own for anyone interested in seeing further by “standing on the shoulders of giants”.

LHCf makes the most of a special run

Run 2 of the LHC may only just have officially begun, but the Large Hadron Collider forward (LHCf) experiment has already completed its data taking with proton–proton collisions at the new high energy of 13 TeV in the centre of mass. The experiment collected data in a special physics run carried out on 8–12 June, just after the start of Run 2.

The motivation of LHCf is to understand the hadronic interactions taking place when high-energy cosmic rays collide with the Earth’s atmosphere, producing bunches of particles known as air showers. These air showers allow the observation of primary cosmic rays with energies from 10^15 eV to beyond 10^20 eV. Because a collision energy of 13 TeV corresponds to the interaction of a proton with an energy of 9 × 10^16 eV hitting the atmosphere, the LHC enables an excellent test of what happens at the energy of the observed air showers.
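The quoted fixed-target equivalence follows from relativistic kinematics: for a proton striking a proton at rest, the squared centre-of-mass energy is s ≈ 2 m_p E_lab, so E_lab ≈ s/(2 m_p). A back-of-the-envelope sketch (not code from the experiment) reproduces the number:

```python
# Fixed-target equivalent energy: a collider with centre-of-mass energy
# sqrt(s) probes the same collision as a cosmic-ray proton of lab energy
# E_lab ~ s / (2 * m_p) hitting a proton at rest in the atmosphere.
M_PROTON_GEV = 0.938  # proton rest mass in GeV

def fixed_target_energy_ev(sqrt_s_tev):
    """Lab-frame proton energy (in eV) equivalent to a collider at sqrt(s) TeV."""
    sqrt_s_gev = sqrt_s_tev * 1e3
    e_lab_gev = sqrt_s_gev**2 / (2.0 * M_PROTON_GEV)
    return e_lab_gev * 1e9  # GeV -> eV

print(f"{fixed_target_energy_ev(13):.1e} eV")  # about 9e16 eV, as quoted
```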

The interaction relevant to air-shower development has a large cross-section, with most of the energy going into producing particles that are emitted in the very forward direction – that is, at very small angles to the direction of the incident particle. LHCf therefore uses two detectors, Arm 1 and Arm 2, installed at 140 m on either side of the interaction point in the ATLAS experiment (CERN Courier January/February 2015 p6).

For LHCf to be able to determine the production angle of individual particles, the experiment requires beams that are more parallel than in the usual LHC collisions. In addition, the probability for more than one collision in a single bunch crossing (pile-up) must be far smaller than unity, to avoid contamination from multiple interaction events. To meet these constraints, for the special run the beams were “unsqueezed” instead of being “squeezed”, making them larger at the collision points. This involved adjusting magnets on either side of the interaction point to increase β* – the parameter that characterizes the machine optics for the squeeze – to a value of 19 m. In addition, the collisions took place either with low beam intensities or with beams offset to each other to reduce pile-up.

The first collisions for physics (“stable beams”) were provided at midnight on 10 June with very low pile-up, followed until noon on 13 June by a total of six machine fills providing various pile-up values ranging from 0.003 to 0.05. This allowed LHCf to take more than 32 hours of physics data, as scheduled.
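Assuming the number of interactions per bunch crossing is Poisson-distributed with mean μ (the quoted pile-up values), the fraction of triggered crossings contaminated by more than one interaction can be estimated. A minimal sketch under that standard assumption:

```python
import math

def pileup_contamination(mu):
    """Fraction of bunch crossings with at least one interaction
    that actually contain two or more (Poisson pile-up model)."""
    p_ge1 = 1.0 - math.exp(-mu)       # P(n >= 1)
    p_exactly1 = mu * math.exp(-mu)   # P(n == 1)
    return (p_ge1 - p_exactly1) / p_ge1

# For the pile-up range quoted in the text
for mu in (0.003, 0.05):
    print(f"mu = {mu}: {pileup_contamination(mu):.2%} of events are pile-up")
```

For small μ the contamination is roughly μ/2, which is why keeping the pile-up far below unity suffices for a clean single-interaction sample.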

Even with a luminosity of 10^29 cm^−2 s^−1 – five orders of magnitude below the nominal LHC luminosity – the LHCf detectors achieved a useful data rate of > 500 Hz, recording about 15% of inelastic interactions with neutral particles of energies > 100 GeV. A preliminary analysis during the run showed the clear detection not only of π0 mesons but also of η mesons, which had not been the case with the data at the collision energy of 7 TeV in Run 1.

A highlight of the operation was collaboration with the ATLAS experiment. During the special run, trigger signals in LHCf were sent to ATLAS, which recorded data accordingly. The analyses of such common events will enable the classification of events based on the nature of processes such as diffractive dissociation and non-diffractive interactions.

The LHCf detectors were removed from the LHC tunnel on 15 June during the first technical stop of the LHC, to avoid the radiation damage that would occur with the increasingly high luminosity for Run 2.

Stable beams at 13 TeV

At 10.40 a.m. on 3 June, the LHC operators declared “stable beams”, signalling the official start of Run 2, see “Stable beams at 13 TeV”. The LHC experiments are now ready to take data at the unprecedented collision energy of 13 TeV and, as this page shows, LHCf has already collected all of the data it requires.

Fifth event signals discovery of ντ appearance

OPERA – the Oscillation Project with Emulsion-tRacking Apparatus experiment at the INFN Gran Sasso Laboratory – has detected the fifth occurrence of a tau neutrino (ντ). Setting out from CERN as a muon neutrino (νμ), the particle was detected at Gran Sasso as a ντ after travelling 730 km through the Earth. This detection of a fifth ντ firmly establishes the direct observation of the transition from νμ to ντ, with a statistical significance of 5σ, the now-standard threshold for a discovery in particle physics.

The international OPERA experiment, which involves about 140 physicists from 26 research institutes in 11 countries, was designed to observe this exceptionally rare phenomenon, gathering data in the neutrino beam produced by the CERN Neutrinos to Gran Sasso (CNGS) project (CERN Courier November 2006 p24). A small fraction of the incoming neutrinos interacted with the giant detector – consisting of more than 4000 tonnes of material, with a volume of some 2000 m³ and some nine million photographic plates – to produce the particles observed. After detecting the first few νμ produced at CERN in 2006, the experiment collected data for five years, from 2008 to the end of 2012. The first ντ was observed in 2010. The second and third events were reported in 2012 and 2013, respectively, while the fourth one was announced in 2014 (CERN Courier May 2014 p9).

The OPERA collaboration will continue to analyse the data collected, searching for other νμ to ντ transitions, and possibly also measure the oscillation parameters, for the first time using oscillated ντ.

HL-LHC begins the move from paper to hardware

The design study for the High-Luminosity LHC (HL-LHC) project is now approaching completion. The conceptual design is completed for most of the magnets, engineering is in progress, and the first hardware that will be used in the prototypes is being manufactured and tested. Recent months have seen successful tests of some of the magnets that will be essential for this high-luminosity upgrade (CERN Courier March 2015 p28).

The interaction regions of the HL-LHC will contain nine different types of new magnets, relying on three different technologies – Nb3Sn and Nb-Ti superconductors in the form of Rutherford cable, and super-ferric magnets with Nb-Ti coils. These magnets are in the design and prototyping phase, being developed internationally by the US LHC Accelerator Research Program (US-LARP), the CIEMAT Research Centre in Spain, CEA Saclay in France, the INFN-Milano LASA laboratories and INFN-Genova in Italy, and KEK in Japan.

In April, the winding and impregnation of the first coil of the super-ferric sextupole corrector were completed, and the coil was successfully tested as a stand-alone coil in the INFN-LASA laboratories. The coil had a first quench at 80% of the short-sample limit – the maximum field achievable in the magnet – and reached 91% after three quenches at 2.5 K. In these correctors, the operational current is set at 60% of the short-sample limit. This was the first test of a component of the HL-LHC interaction-region magnets, with an operational peak field in the coil of 2.3 T.

In May, the first coil for the Nb3Sn short quadrupole model, manufactured by the US-LARP collaboration, was tested in a mirror configuration at Fermilab. The coil had a first quench at 70% of the short-sample limit, a second one at 76%, and reached 90% after 20 quenches. The triplet will operate at 75%, with a peak field of 11.5 T – a value that has been recently reduced from the original 80% to add some margin, following the advice of a review committee held at CERN in December.

Since the beginning of the year, coil-winding tests have been under way, both at KEK, for the 5.6-T Nb-Ti separation dipole (D1), and at Saclay, for the 115-T/m-gradient Nb-Ti quadrupole. An iteration of the design of the iron yoke was performed at KEK to guarantee a better alignment of the dipole field during assembly. At Saclay, the first tests have confirmed the correct geometry of the end spacers and of the coil components.

The next step is testing of the first Nb3Sn quadrupole short model this coming autumn. This is made up of two CERN coils, which have recently been shipped to the US, and two LARP coils. A test of the first corrector sextupole is foreseen in LASA at the end of the year, and a test of the first short model of the separation dipole will be carried out at KEK.

• Based on an article in acceleratingnews.web.cern.ch.

Korean Tier-1 link upgrades to 10 Gbps

On 21 May, the Korea Institute of Science & Technology Information–Global Science experimental Data hub Center (KISTI-GSDC) – the Korean Tier-1 site of the Worldwide LHC Computing Grid (WLCG) – completed the upgrade to 10 Gbps of the bandwidth of its optical-fibre link to CERN. The link is part of the LHC Optical Private Network (OPN) that is used for fast data replication from the Tier-0 at CERN to Tier-1 sites in the WLCG.

KISTI-GSDC was approved as a full Tier-1 site at the 24th WLCG Overview Board in November 2013, backed by the ALICE community’s appreciation of the effort to sustain the site’s reliability and the contribution to computing resources for the experiment. At the time, the bandwidth of the dedicated connection to CERN provided by KISTI-GSDC was below that required, but the road map for upgrading the bandwidth was accepted.

The original proposal was to provide the upgrade of the OPN link by October 2014. However, following an in-depth revision of the executive plan with the Ministry of Science, ICT and Future Planning – the funding agency – to find the most cost-effective way, the upgrade process did not start until the end of February this year. It was finally completed just before the scheduled start of the LHC’s Run 2 in May.

The OPN link between KISTI and CERN is composed of two sections: Daejeon–Chicago (operated by KISTI) and Chicago–Geneva (operated by SURFnet). The link is complemented by a backup line that can be switched on whenever an intervention on the main line is necessary. The yearly budget is about CHF1.1 million.

STAR helps to pin down a key phenomenon in gold collisions

The STAR collaboration at the Brookhaven National Laboratory (BNL) has published new evidence indicative of a “chiral magnetic wave” rippling through the quark–gluon plasma created in high-energy gold–gold collisions at the Relativistic Heavy Ion Collider (RHIC).

Heavy-ion collisions at RHIC and the LHC involve many spectators – nucleons that are not involved in any direct collision. The charged spectators – protons – have an important influence because they can produce a magnetic field of some 10^14 T. In principle, this can lead to a collective excitation in the hot dense matter produced, the chiral magnetic wave. It results from the separation both of electric charge and of chiral charge, that is, right or left “handedness”, but only in a chirally symmetric phase. The phenomenon is predicted to manifest itself as an electric quadrupole moment of the collision system, where the “poles” and “equator” of the system acquire, respectively, additional positive and negative particles. This in turn influences differently the elliptic flow of positive and negative particles, decreasing the former and increasing the latter.

To look for this effect, STAR measured the elliptic flow, v2, of π+ and π− produced in gold–gold collisions at mid-rapidity, as a function of the event-by-event charge asymmetry, Ach, over a range of energies. The team found that v2 increased linearly with Ach for π−, but decreased for π+. At the highest energy, √sNN = 200 GeV, the slope of the difference in v2 between the π+ and π− as a function of Ach depends on the centrality of the collision in a manner consistent with calculations that incorporate the chiral magnetic wave. The team also found a similar result for energies down to √sNN = 27 GeV, with no obvious dependence on beam energy. The researchers note that none of the conventional models they have considered appear to explain the observations.

B0 decay reveals an intriguing anomaly

At the Flavor Physics and CP Violation (FPCP) conference in Nagoya, the LHCb collaboration presented a measurement of the rate of B0 → D*+τντ relative to the related decay B0 → D*+μνμ. The first measurement of any B → τX decay at a hadron collider, it also indicates a tantalizing anomaly.

In the Standard Model, the ratio of these two branching fractions differs from unity only as a result of effects related to the mass of the much heavier τ lepton. The ratio R(D*) = BR(B0 → D*+τντ)/BR(B0 → D*+μνμ) is therefore precisely calculable in the Standard Model as equal to 0.252±0.003.

Lepton universality dictates that the electroweak coupling strength of the electron, muon and tau are identical, with the three flavours distinguished only by their respective masses. So the observation of decays with differing rates to each lepton flavour, after accounting for mass effects, would be a clear sign of physics beyond the Standard Model. Owing to the large τ mass, the semitauonic B0 → D*+τντ decay rate is particularly sensitive to the charged Higgs bosons predicted by many extensions of the Standard Model. Previous measurements have consistently been above predictions, making new results hotly anticipated.

LHCb has analysed 3 fb^−1 of data from Run 1 of the LHC to measure R(D*) using the τ → μνμντ decay, which allows both the semitauonic and the semimuonic mode to be reconstructed in the same final state. The two decays are distinguished via a fit to the decay kinematics, reconstructed using the visible decay products and an approximation for the rest frame of the B (see figure). In addition to the B0 → D*+τντ and B0 → D*+μνμ decays, the D*+μX final state also receives large contributions from several background processes. The modelling of these backgrounds in LHCb is constrained using control samples in data, substantially reducing the uncertainties from theoretical modelling. The result, 0.336±0.027±0.033, is in close agreement with a result from BaBar in 2012, and lies 2.1σ away from the Standard Model prediction.
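Ignoring correlations, the size of the tension can be roughly reproduced by combining the statistical and systematic uncertainties of the measurement with the prediction uncertainty in quadrature. This is a naive back-of-the-envelope sketch, not the collaboration's statistical treatment:

```python
import math

# Values quoted in the text
r_meas, stat, syst = 0.336, 0.027, 0.033   # LHCb measurement of R(D*)
r_sm, sm_err = 0.252, 0.003                # Standard Model prediction

# Naive significance: all uncertainties added in quadrature,
# correlations ignored
sigma_tot = math.sqrt(stat**2 + syst**2 + sm_err**2)
n_sigma = (r_meas - r_sm) / sigma_tot
print(f"tension ~ {n_sigma:.1f} sigma")  # close to the quoted 2.1 sigma
```

The small residual difference from the quoted 2.1σ reflects whatever rounding and correlation treatment enters the published number.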

Between the results from LHCb, BaBar and the Belle collaboration – which also presented updated results at the conference – a tantalizing picture is emerging in this channel. LHCb already has plans for complementary measurements in the decays B → D0τντ and Λ0b → Λ+c τντ with the LHC Run 1 data set, and data from Run 2 is expected to allow for exciting improvements.

LHCb observes top production in the forward region

Studies of the production of top quarks in the forward region at the LHC are potentially of great interest in terms of new physics. Not only does the process have an enhanced sensitivity to physics beyond the Standard Model (owing to sizable contributions from quark–antiquark and gluon–quark scattering), but measurements of the forward production of top-quark pairs (tt) can be used to constrain the gluon parton distribution function (PDF) at large momentum fraction. Reducing the uncertainty on this PDF will increase the precision of many Standard Model predictions, especially those that serve as backgrounds to searches for new high-mass particles.

Top quarks decay almost exclusively to a W boson and a b-quark jet. The LHCb collaboration has already made high-precision measurements of W-boson production, and recently demonstrated the ability to identify, or tag, jets originating from b and c quarks (LHCb 2015a). Now, the collaboration has combined these two abilities in a study of W-boson production in association with b and c jets (LHCb 2015b), using a subset of these data samples to observe top-quark production for the first time in the forward region (LHCb 2015c). The data show a large excess of events compared with the Standard Model’s W+b-jet prediction in the absence of top-quark production (see figure).

LHCb measured the top-quark production cross-sections in a reduced fiducial region chosen to enhance the relative top-quark content of the W+b-jet final state. Within this region, the inclusive top-quark production cross-sections, which include contributions from both tt and single-top production, are σ(top) [7 TeV] = 239±53(stat.)±38(syst.) fb and σ(top) [8 TeV] = 289±43(stat.)±46(syst.) fb. These values are in agreement with the Standard Model predictions of 180 +51/−41 fb (312 +83/−68 fb) at 7 (8) TeV, obtained at next-to-leading order using MCFM, the Monte Carlo programme for femtobarn processes.

In the LHC’s Run 2, the higher beam energy should lead to a greatly increased cross-section and acceptance for top-quark production. This will allow LHCb to measure precisely both tt and single-top production, and so provide important constraints on the gluon PDF as well as potential signs for physics beyond the Standard Model.

On the trail of long-lived particles

When searching for new particles in ATLAS, it is often assumed that they will either decay to observable Standard Model particles at the centre of the detector, or escape undetected, in which case their presence can be inferred by measuring an imbalance of the total transverse momentum. This assumption was a guiding principle in designing the layout of the ATLAS detector.

However, another possibility exists: what if new particles are long lived? Many models of new physics include heavy particles with lifetimes large enough to allow them to travel measurable distances before decaying. Heavy particles typically decay quickly into lighter particles, unless the decay is suppressed by some mechanism. Suppression could occur if couplings are small, if the decaying particle is only slightly heavier than the only possible decay products, or if the decay is mediated by very heavy virtual exchange particles. Looking for signatures of these models in the LHC data implies exploiting the ATLAS detector in ways it was not necessarily designed for.

These models can give rise to a broad range of possible signatures, depending on the lifetime, charge, velocity and decay channels of the long-lived particle. Decays to charged particles within the ATLAS detector volume can be detected as “displaced vertices”. Heavy charged particles that traverse the detector will move more slowly than their Standard Model counterparts, and will leave a trail of large ionization-energy deposits. Particles with very long lifetimes could even stop in the dense material of the calorimeter and decay at a later time. The ATLAS collaboration has performed dedicated searches to explore all of these spectacular – and challenging – signatures.

Standard reconstruction algorithms are not optimal for such unconventional signatures, so the ATLAS collaboration has used detailed knowledge of the experiment’s sub-detectors to develop dedicated algorithms; for example, to reconstruct charged-particle tracks from displaced decays or to measure the ionization charge deposited by long-lived charged particles. A class of specialized triggers for picking up these signatures has also been designed and deployed.

These searches generally have very low background, but it is nevertheless essential to estimate the level because some of the signatures could be faked by instrumental effects that are not well-modelled in the simulation. Sophisticated data-driven background estimation techniques have therefore been developed.

One postulated type of long-lived particle is the “R hadron” – a bound state of a colour-charged supersymmetric particle and Standard Model quarks and gluons. Several ATLAS searches are sensitive to R hadrons, and between them they cover a wide range of lifetimes, as the figure (top right) shows (ATLAS Collaboration 2013 and 2015a). Other analyses have searched for a long-lived hidden-sector pion (“v pion”) by looking for displaced vertices in different ATLAS sub-detectors (ATLAS Collaboration 2015b and 2015c). Exotic Higgs-boson decays to long-lived neutral particles that decay to jets were constrained to a branching ratio smaller than 1% at the 95% confidence level, for a range of lifetime values, as in the figure (right).

With 13-TeV collisions under way at the LHC, the probability of producing heavy new particles has increased enormously, revitalizing the searches for new physics. ATLAS experimentalists are rising to the challenge of exploring as many new physics signatures as possible, including those related to long-lived particles.
