ATLAS in 2012: building on success

There was a keen sense of anticipation and excitement throughout the ATLAS collaboration as 2012 dawned. The LHC had performed superbly over the previous two years, delivering 5 fb⁻¹ of proton–proton collision data at a centre-of-mass energy of 7 TeV in 2011, thereby allowing ATLAS to embark on a thorough exploration of a new energy regime. This work culminated with the first hints of a potential Higgs-like particle at a mass of about 126 GeV being reported by both the ATLAS and CMS collaborations at the CERN Council meeting in December 2011. With the promise of a much larger data sample at the increased collision energy of 8 TeV in 2012, everyone looked forward to seeing what the new data might bring.

The period leading up to the first collisions in early April 2012 saw intensive activity on the ATLAS detector itself, with the installation of additional sets of chambers to improve the coverage of the muon spectrometer, as well as the regular winter maintenance and consolidation work – essential for making sure that the detector was ready for the long year of data-taking ahead. With the promise of high-luminosity data with up to 40 simultaneous proton–proton collisions (“pile-up”) per bunch crossing – some 2–3 times more than seen in 2011 – experts from the groups responsible for the trigger, offline reconstruction and physics objects worked intensively to ensure that the online and offline software and selections were ready to cope with the influx of data. Careful optimization ensured that the performance of selections for electrons, τ leptons and missing transverse momentum, for example, was made stable against high levels of pile-up, while still keeping within the limits of the computing resources and maintaining – or even exceeding – the efficiencies and purities obtained in the 2011 data.

Meanwhile, the physics-analysis teams worked to finalize their analyses of the 2011 data for presentation at the winter/spring conferences and subsequent publication, while at the same time preparing for analysis of the new data. Members of the Higgs group focused attention on the two high mass-resolution channels H→γγ and H→ZZ(*)→4 leptons (figure 1), where the Higgs signal would appear as a narrow peak above a smoothly varying background. These channels had shown hints in the 2011 data and had the greatest potential to deliver early results in 2012. Using data samples from 2011 and a Monte Carlo simulation of the anticipated new data at 8 TeV, the analyses were re-optimized to maximize sensitivity in the mass region of 120–130 GeV, taking full advantage of the new object-reconstruction algorithms and selections.
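In both of these channels the key observable is the invariant mass of the final state, reconstructed from the measured four-momenta of the photons or leptons; a Higgs signal appears as a narrow peak in that distribution. A minimal sketch of the reconstruction, using made-up four-vectors rather than ATLAS data:

```python
import math

def invariant_mass(particles):
    """Invariant mass of a set of particles from (E, px, py, pz) four-vectors:
    m^2 = (sum E)^2 - |sum p|^2, in consistent units (here GeV)."""
    E = sum(p[0] for p in particles)
    px = sum(p[1] for p in particles)
    py = sum(p[2] for p in particles)
    pz = sum(p[3] for p in particles)
    return math.sqrt(max(E**2 - px**2 - py**2 - pz**2, 0.0))

# Toy example: two massless back-to-back photons of 63 GeV each give a
# diphoton mass of 126 GeV, the region where the hints appeared.
photons = [(63.0, 63.0, 0.0, 0.0), (63.0, -63.0, 0.0, 0.0)]
print(invariant_mass(photons))  # 126.0
```

The same calculation applied to the four leptons of an H→ZZ(*)→4-lepton candidate yields the four-lepton mass used in that channel.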

The race to Australia

Once data-taking began in early April, the first priority was to calibrate and verify the performance of the detector, trigger and reconstruction, comparing the results with the new 8 TeV Monte Carlo simulation. The modelling of pile-up was particularly important and was checked using a dedicated low-luminosity run of the LHC, where events were recorded with only a single interaction per bunch crossing. Having established the basic conditions for physics analysis, attention then turned to preparations for the International Conference on High-Energy Physics (ICHEP) taking place on 5–11 July in Melbourne, where the particle-physics community and the world’s media would be eagerly awaiting the latest results from the new data.

As ICHEP drew nearer, the LHC began to deliver the goods, with up to 1 fb⁻¹ of data per week. Each new run was recorded, calibrated and processed through the Tier-0 centre of the Worldwide LHC Computing Grid at CERN, before being thoroughly checked and validated by the ATLAS data-quality group and delivered to the physics-analysis teams on a regular weekly schedule. At the same time, the worldwide computing Grid resources available to ATLAS worked round the clock to prepare the corresponding Monte Carlo simulation samples at the new collision energy of 8 TeV. At first, the analysers in the Higgs group restricted their attention to control regions in data, aiming to prove to themselves and the rest of the collaboration that the new data were thoroughly understood. After a series of review meetings, with a few weeks remaining before ICHEP, the go-ahead was given to “un-blind” the data taken so far – a moment of great excitement and not a little anxiety.

At first only hints were visible but as more data were added week by week and combined with the results from an improved analysis of the 2011 data, it rapidly became clear that there was a significant signal in both the γγ and 4-lepton channels. The last few weeks before ICHEP were particularly intense, with exhaustive cross-checks of the results and many discussions on exactly how to present and interpret what was being seen. With the full 5.8 fb⁻¹ sample from LHC data-taking up until 18 June included, ATLAS had signals with significances of 4.5σ in the γγ channel and 3.4σ in 4 leptons, leading to the reporting of the observation of a new particle with a combined significance of 5.0σ at the special seminar at CERN on 4 July and at the ICHEP conference.
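For orientation, a significance of Z σ corresponds to a one-sided Gaussian tail probability p = 1 − Φ(Z), so the 5σ discovery convention corresponds to roughly a three-in-ten-million chance of the background alone fluctuating that far. The conversion is simple to sketch (the actual ATLAS combination of channels is a full profile-likelihood fit with systematic uncertainties, not this one-line formula):

```python
from statistics import NormalDist

def significance_to_pvalue(z):
    """One-sided Gaussian tail probability for a significance of z sigma."""
    return 1.0 - NormalDist().cdf(z)

def pvalue_to_significance(p):
    """Inverse conversion: significance in sigma for a one-sided p-value."""
    return NormalDist().inv_cdf(1.0 - p)

print(f"{significance_to_pvalue(5.0):.2e}")   # 2.87e-07
print(round(pvalue_to_significance(2.87e-7), 2))  # 5.0
```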

Similar signals were seen by CMS and both collaborations submitted papers reporting the discovery of this new Higgs-like resonance at the end of July. As well as the γγ and 4-lepton results reported at ICHEP, the paper by ATLAS also included the analysis of the H→WW(*)→lνlν channel, which revealed a broad excess with a significance of 2.8σ around 125 GeV. The combination of these three channels together with the 2011 data analysis from several other channels established the existence of this new particle at the 5.9σ level (figure 2), ushering in a new era in particle physics.

Searching for the unexpected

As well as following up on the hints of the Higgs seen in the 2011 data, the ATLAS collaboration has continued to conduct intensive searches across the full range of physics scenarios beyond the Standard Model, including those that involve supersymmetry (SUSY) and non-SUSY extensions of the Standard Model. More than 20 papers have been published or submitted on SUSY searches with the complete 2011 data set, with a similar number published on other searches beyond the Standard Model. One particular highlight is the search for the dark matter that is postulated to exist from astronomical observations but which has never been seen in the laboratory. By searching for “unbalanced” events, in which a single photon or jet of particles is produced recoiling against a pair of “invisible” undetected particles, limits can be set on the interaction cross-sections of the dark-matter candidates known as weakly interacting massive particles (WIMPs) with ordinary matter. Using the full 2011 data set, ATLAS was able to set limits on such WIMP-nucleon cross-sections for WIMPs of mass up to around 1 TeV; these limits are complementary to those achieved by direct-detection and gamma-ray observation experiments.
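The “unbalanced” signature rests on missing transverse momentum: the negative vector sum of the transverse momenta of all visible objects in the event. A schematic calculation with toy numbers (a hypothetical monojet event, not ATLAS data):

```python
import math

def missing_et(visible):
    """Magnitude of missing transverse momentum from a list of visible
    objects, each given as (pt, phi) in GeV and radians: it is the
    negative vector sum of their transverse momenta."""
    mex = -sum(pt * math.cos(phi) for pt, phi in visible)
    mey = -sum(pt * math.sin(phi) for pt, phi in visible)
    return math.hypot(mex, mey)

# A single 200 GeV jet with nothing else visible: the recoiling invisible
# system (e.g. a WIMP pair) must carry 200 GeV of missing transverse momentum.
print(missing_et([(200.0, 0.0)]))  # 200.0
```

A balanced dijet event, by contrast, gives missing transverse momentum near zero, which is why the monojet plus large missing momentum topology is such a clean handle.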

Another highlight is the search for new particles that decay into pairs of top (t) and antitop (t̄) quarks, giving rise to resonances in the tt̄ invariant-mass spectrum. The complete 2011 data set gives access to invariant masses well beyond 1 TeV, where the t and t̄ tend to decay in “boosted” topologies with two sets of back-to-back collimated decay products. By reconstructing each top decay as a single “fat” jet and exploiting recently developed techniques to search for distinct objects within the “substructure” of these jets, ATLAS was able to set limits on the production of resonances from the decay of Z′ bosons or Kaluza–Klein gluons in the tera-electron-volt range, even though high levels of pile-up added noise to the jet substructure. Such techniques will become even more important in extending these searches to higher masses with the full 2012 data sample.
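One of the simplest substructure handles is the jet mass itself: a fat jet containing all the decay products of a boosted top quark acquires an invariant mass near the top mass (about 173 GeV), whereas an ordinary light-quark or gluon jet has a much smaller mass. A toy sketch, with hypothetical massless constituents given as (pT, η, φ):

```python
import math

def jet_mass(constituents):
    """Invariant mass of a jet built from massless constituents given as
    (pt, eta, phi): E = pt*cosh(eta), pz = pt*sinh(eta), then
    m^2 = E^2 - |p|^2 for the summed four-vector."""
    E = px = py = pz = 0.0
    for pt, eta, phi in constituents:
        E += pt * math.cosh(eta)
        px += pt * math.cos(phi)
        py += pt * math.sin(phi)
        pz += pt * math.sinh(eta)
    return math.sqrt(max(E**2 - px**2 - py**2 - pz**2, 0.0))

# Two collimated massless subjets separated by 0.6 rad in phi: although each
# constituent is massless, the pair carries a large invariant mass.
print(round(jet_mass([(300.0, 0.0, 0.0), (300.0, 0.0, 0.6)]), 1))  # 177.3
```

Pile-up adds extra soft constituents to the jet, which is exactly why it smears both the jet mass and the finer substructure variables mentioned above.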

The search for SUSY continued apace in 2012, with new results from 8 TeV data presented at both the SUSY 2012 conference in August and the Hadron Collider Physics Symposium in November. By looking for events with several jets and large missing transverse energy, limits on the strong production of squarks and gluinos were pushed beyond 1.5 TeV for equal-mass squarks and gluinos in the framework of minimal supergravity grand unification (mSUGRA) and the constrained minimal supersymmetric extension of the Standard Model (CMSSM). The lack of evidence for “generic” SUSY signatures with masses close to the electroweak and top-quark mass scales – together with the discovery of a light Higgs-like object around 126 GeV – has led to much theoretical interest in scenarios where only the third-generation SUSY particles (top and bottom squarks, the stau lepton) are relatively light. ATLAS performed a series of dedicated searches for the direct production of bottom and top squarks. The latter in particular give rise to final states that are similar to top-pair production, so searches become particularly challenging if the masses of the top squark and the top quark are similar. Data from 2012 were used to fill much of the “gap” around the mass of the top quark (figure 3).

Precision measurements

The ATLAS search programme described above relies on a thorough understanding of the Standard Model physics processes that form the background to any search but are also interesting to study in their own right. Fully exploiting the large statistics of the 2011 and 2012 data samples requires an understanding of the efficiencies, energy scales and resolutions for physics objects such as electrons, muons, τ leptons, jets and b-jets to the level of a few per cent or better, which in turn requires a dedicated effort that continued throughout 2012. This effort paid off in a large number of precise measurements involving the production of combinations of W and Z bosons, photons and jets, including those with heavy flavour. In many cases, these results challenge the current precision of QCD-based Monte Carlo calculations and provide important input for improving the ability to describe physics at LHC energy scales. Studies of high-rate jet production and soft QCD processes have also continued, with measurements of event shapes, energy flow and the underlying event contributing to knowledge of the backgrounds that underlie all physics processes at the LHC. The measurements of WW, WZ, ZZ, Wγ and Zγ production have allowed stringent constraints to be placed on anomalous couplings of these bosons at high energies, in addition to being an essential ingredient in understanding the backgrounds to Higgs searches.

The large top-quark samples available in the data from 2011 and now 2012 have opened up a new era in the study of the heaviest known fundamental particle. The cross-sections for the production of both tt̄ pairs and single top quarks have been measured precisely at both 7 TeV and 8 TeV; evidence for the associated production of a W boson and a top quark has also been observed. Limits have been set on the associated production of tt̄ pairs together with W and Z particles, and even Higgs bosons, and these studies will be extended with the full 2012 data set. The asymmetry in tt̄ production has also been measured with the full 7 TeV data set – although, unlike at the Tevatron at Fermilab, no hints of anomalies have been seen. The polarizations of top quarks and of the W bosons produced in their decays have been measured and spin correlations between decaying t and t̄ quarks observed. Furthermore, ATLAS has begun to characterize the top-quark production processes in detail, looking at kinematic distributions and the production of associated jets – key ingredients in increasing the precision of top-quark measurements, as well as in evaluating top-quark backgrounds in searches for physics beyond the Standard Model.

Searches by ATLAS

In addition, ATLAS has continued to exploit the large samples of B hadrons produced at the LHC, in particular those from dimuon final states, which can be recorded even at the highest LHC luminosities. Highlights include the detailed study of CP violation in the decay Bs→J/ψφ, which was found to be in perfect agreement with the expectation from the Standard Model, and the precise measurement of the Λb mass and lifetime.

In late 2011, ATLAS recorded around 20 times more lead–lead collisions than in 2010, allowing the studies of the hot, dense medium produced in such collisions to be expanded to include photons and Z bosons, as well as jets. A new technique was developed to subtract the “underlying event” background in lead–lead collisions, enabling precise measurements of jet energies and the identification of electrons and photons in the electromagnetic calorimeter. Bosons emerge from the nuclear collision region “unscathed”, opening the door to using the energy balance in photon-jet and Z-jet events to study the energy loss suffered by jets. In addition, ATLAS has pursued a broad heavy-ion physics programme, which includes the study of correlations and flow, charged-particle multiplicities and suppression, as well as heavy-flavour production. The collaboration looks forward eagerly to the proton–lead physics run scheduled for early 2013.

What is next?

At the time of writing, ATLAS is on track to record more than 20 fb⁻¹ of proton–proton collision data in 2012 and studies of these data by the various teams are in full swing across the whole range of search and measurement analyses. Building on the discovery announced in July, the next task for the Higgs analysis group is to learn more about the new particle, comparing its properties with those expected for the Standard Model Higgs boson and various alternatives. A first step was presented in September, where the July analyses were interpreted in terms of limits on the coupling strength of the new particle to gauge bosons, leptons and quarks, albeit with limited precision at this stage. It is also important to see if the particle decays directly to fermions, by searching for the decays H→ττ and H→bb.

These analyses are extremely challenging because of the high backgrounds and low invariant-mass resolution but first results using 13 fb⁻¹ of 8 TeV data were presented at the Hadron Collider Physics Symposium in November. These results are not yet conclusive; the full 2012 data sample is needed to make any definite statements. At that point, it should also be possible to probe the spin and CP-properties of the new particle and improve the precision on the couplings, bringing the picture of this fascinating new object into sharper focus. At the same time, first results from searches beyond the Standard Model with the complete 2012 data set should be available, further increasing the sensitivity across the full spectrum of new physics models. The analysis of this data set will continue throughout the 2013–2014 shutdown, setting the stage for the start of the 13–14 TeV LHC physics programme in 2015 with an upgraded ATLAS detector.

• This article has only scratched the surface of the ATLAS physics programme in 2012. For more details of the more than 200 papers and 400 preliminary results, please see https://twiki.cern.ch/twiki/bin/view/AtlasPublic.

Quarks on the menu in Munich

Some 400 theorists and experimentalists from all around the world convened in Munich on 8–12 October to discuss developments in the theory of strong interactions. They were attending the tenth conference on “Quark Confinement and the Hadron Spectrum” (ConfX) at the Garching Research Campus, hosted by the Physics Department of the Technical University of Munich (TUM), with support from the Excellence Cluster “Origin and Structure of the Universe”. Topics included areas at the boundaries of the field, such as theories beyond the Standard Model with a strongly coupled sector and QCD approaches to nuclear physics and astrophysics.

Inaugurated in 1994 in Como, Italy, this series of conferences has established itself as an important forum in the field, bringing together people working in strong interactions on approaches that range from lattice QCD to perturbative QCD, models of the QCD vacuum to phenomenology and experiments, the mechanism of confinement to deconfinement and heavy-ion physics, and from effective field theories to physics beyond the Standard Model. Taking place at a particularly important time for particle physics, with the observation of a Higgs-like particle at CERN, the tenth conference provided a valuable opportunity not only to reconsider what was done on past occasions but also to discuss the perspectives for strongly coupled theories.

The scientific focus of ConfX was spread across seven main scientific sessions: vacuum structure and confinement; light quarks; heavy quarks; deconfinement; QCD and new physics; nuclear and astroparticle physics; and strongly coupled theories. These subjects are relevant for the physics of B factories (Belle and BaBar), tau-charm experiments (BESIII), LHC experiments (LHCb, CMS, ATLAS), heavy-ion experiments (RHIC, ALICE), future experiments at FAIR-GSI (PANDA, CBM) and in general for many low-energy experiments (such as at Jefferson Lab, COSY, MAMI) and some parts of experimental astrophysics.

It is impossible to summarize here the wealth of results presented at the meeting, the intensity of the discussions and the flow of information. What follows is just a brief selection.

The first plenary session began with recent progress in the theoretical calculations of double parton-scattering at the LHC presented by Aneesh Manohar of the University of California, San Diego. The application of soft collinear effective theory to many collider physics processes was then introduced by Thomas Becher of Bern University and followed by a review of quarkonium production by Kuang-Ta Chao of Peking University. In particular, J/ψ production has now finally been calculated at next-to-leading order in nonrelativistic QCD (NRQCD) and the extraction of colour-octet matrix elements from a combined fit to collider data has become possible for the first time. The current picture hints at the universality of the NRQCD matrix elements and a proof of the NRQCD factorization in the fragmentation approach seems to be close. Predictions for the production of Υ and other quarkonia states at the LHC experiments are now available. The progress in theory together with the new LHC data should soon allow the resolution of the long-standing puzzles about the J/ψ polarization and the production mechanism of quarkonium, both at hadron colliders and at B factories.

Heavy ions and more

The study of quarkonium production and suppression at finite temperature in heavy-ion collisions, as a probe of the quark–gluon plasma, was reviewed in the context of a new effective field-theory approach (potential NRQCD at finite temperature). As Jacopo Ghiglieri of McGill University reported, this marks a shift in paradigm from the typical phenomenological description: quarkonium dissociation is caused by the emergence of a large imaginary part in the quark–antiquark potential, rather than by Debye screening. The effective field-theory approach allows a systematic calculation of the thermal modifications in the energy and width of the Υ(1S) as produced at the LHC in heavy-ion collisions.

There has been great progress in developing the capabilities of the lattice approach to calculate the properties of heavy and light quarks, and also in connection to chiral effective field theories, as Peter Lepage of Cornell University, Laurent Lellouch of the Centre de Physique Théorique, Marseilles, and Zoltan Fodor of the University of Wuppertal reported.

Light scalars have long attracted interest, along with a controversy, dating back to the 1950s, about their existence and nature. That controversy has been resolved in recent years by means of better data and more powerful theoretical techniques, including effective Lagrangians and dispersion theory, as José Pelaez of the Complutense University of Madrid argued.

Highlights in strong physics beyond the Standard Model presented at the conference included: composite dynamics, put in context at the time of the Higgs discovery by Francesco Sannino of the Centre for Cosmology and Particle Physics Phenomenology, Odense; gauge–gravity duality; holographic QCD, explained by Shigeki Sugimoto of Tokyo University; and applications of the anti-de Sitter/conformal-field-theory correspondence to heavy-ion collisions, contrasted with proton–proton physics at the LHC now and in the future, including the outstanding LHC results, presented by Günther Dissertori of ETH Zurich. This session culminated in a heated discussion about future strongly coupled scenarios, led by Antonio Pich of Valencia University, in which different views of scenarios beyond the Standard Model were discussed but remained unreconciled among the panel members Estia Eichten of Fermilab, Emanuel Katz of Boston University, Juan Maldacena of the Institute for Advanced Study, Princeton, and Stefan Pokorski of the University of Warsaw.

The plenary session on Wednesday morning was dedicated to the impact of QCD on nuclear and astroparticle physics. Opening the session, Ulrich Wiedner of Ruhr University Bochum presented a comprehensive review of the highlights and future of low-energy experiments in hadron physics. An effective field-theory and lattice description of a variety of nuclear bound states and reactions, as well as a review of the low-energy interaction of strange and charm hadrons with nucleons and nuclei, were presented by Evgeny Epelbaum, also of Bochum, and William Detmold of the Massachusetts Institute of Technology. Charles Horowitz of Indiana University spoke about multimessenger observations of neutron-rich matter, describing the Lead Radius Experiment (PREX) at Jefferson Lab, which measures the neutron density of 208Pb using parity-violating electron scattering. This has important implications for neutron-rich matter and neutron stars. He also described X-ray observations of radii of neutron stars, which are possibly model dependent, and their implications for the equation of state. Gravitational-wave observations of merging neutron stars and r-mode oscillations were discussed in terms of the equation of state, mechanical properties and bulk and shear viscosities of neutron-rich matter. This prepared the ground for the roundtable discussion on “What can compact stars really tell us about dense QCD matter?”, chaired by Andreas Schmitt of the Vienna University of Technology.

On Thursday morning, Pich gave an overview of the perturbative determination of αs, in which he presented the final value of 0.1187 ± 0.0007 and discussed the impact of the different types of αs extraction on the final result.
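At its simplest, combining independent extractions into a single value with an uncertainty is an inverse-variance weighted average (real αs averages must also handle correlations and scale ambiguities). A sketch with hypothetical input values, not the actual extractions behind the quoted 0.1187 ± 0.0007:

```python
import math

def weighted_average(measurements):
    """Inverse-variance weighted average of (value, uncertainty) pairs.
    Returns (mean, uncertainty); assumes uncorrelated inputs."""
    weights = [1.0 / sigma**2 for _, sigma in measurements]
    mean = sum(w * v for w, (v, _) in zip(weights, measurements)) / sum(weights)
    return mean, 1.0 / math.sqrt(sum(weights))

# Hypothetical alpha_s extractions as (value, uncertainty):
inputs = [(0.1184, 0.0010), (0.1192, 0.0015), (0.1186, 0.0012)]
mean, err = weighted_average(inputs)
print(f"{mean:.4f} +/- {err:.4f}")  # 0.1186 +/- 0.0007
```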

A number of low-energy precision measurements are sensitive to new physics either because the Standard Model prediction for the measured quantity is precisely known – for example, the anomalous magnetic moment of the muon (g-2) – or because the Standard Model “background” is small, as in the case of electric dipole moments (EDMs). Timothy Chupp of the University of Michigan presented several studies that are under way to probe physics beyond the Standard Model, including g-2 and EDMs. He also described the prospects for the precision measurement of the Cabibbo-Kobayashi-Maskawa matrix element, Vud, from neutron decay, i.e. the neutron lifetime and measurement of the axial-vector coupling constant (gA), as well as couplings beyond the Standard Model accessible from neutron decay. The discussion culminated in the roundtable “Resolving physics beyond the Standard Model at low energy” led by Susan Gardner of the University of Kentucky.

The final plenary session on Friday afternoon started with a talk by Mikko Laine of the University of Bern, in which he drew analogies and relationships between hot QCD and cosmology. John Harris of Yale University went on to review the latest heavy-ion data from Brookhaven’s Relativistic Heavy-Ion Collider (RHIC) and the LHC. In particular, the data show how the “soup” of quark–gluon plasma flows easily, with extremely low viscosity – suggesting a near-perfect liquid of quarks and gluons. However, it appears opaque to energetic partons at RHIC and less so to the extremely energetic parton probes available in collisions at the LHC. This review was followed by presentations on the theoretical challenges and perspectives in the exploration of the hot QCD matter, including recent highlights in lattice calculations at finite temperature and finite density as presented by Peter Petreczky of Brookhaven National Laboratory. The session culminated with a roundtable about “Quark Gluon Plasma: what is it and how do we find it out?” chaired by Berndt Mueller of Duke University.

Yiota Foka of GSI and CERN reported on the International Particle Physics Outreach Group, which has developed an educational activity that brings LHC data into the classroom. Each year since 2005, thousands of high-school students in many countries go to nearby universities or research centres for one day to unravel the mysteries of particle physics and to be “scientists for a day”. In 2012, 10,000 students from 130 institutions in 31 countries took part in the popular event over a four-week period.

The conference featured a plenary session and seven sessions running in parallel on the subjects of the seven topical sections, with a total of 250 parallel talks. The sections on vacuum structure and confinement and on deconfinement constituted almost two conferences in themselves, with a total of 54 talks in 17.5 hours and 57 talks in 24 hours, respectively. The conference as a whole ended with a visionary talk by Chris Quigg of Fermilab on “Beyond Confinement”. The extraordinary scientific discussion and exchange that characterized the conference has served as a trigger for a document “Strongly Coupled Physics: challenges, scenarios and perspectives” that is currently in preparation in collaboration with the section conveners.

During the poster session, participants could also enjoy tasting cheese and a variety of wines from all of the countries represented. A ride down the gigantic slide belonging to the Mathematics Department complemented the lively scientific discussions. An evening session on the “Colourful world of quarks and gluons”, with talks by Gerhard Ecker (“The shaping of QCD”) and Thomas Mannel (“The many facets of QCD”), attracted the public from Garching and from the many campus research institutes, as well as conference participants. Tours of Munich, glimpses of Bavarian culture at the famous Hofbräuhaus and a social dinner at the Hofbräukeller complemented the opportunity to discover the local campus facilities (the TUM Institute for Advanced Study and the TUM engineering, mathematics and physics departments).

A watershed: the emergence of QCD

David Gross and Frank Wilczek

In a recent article, Harald Fritzsch shared his perspective on the history of the understanding of the strong interaction (CERN Courier October 2012 p21). Here, we’d like to supplement that view. Our focus is narrower but also sharper. We will discuss a brief but dramatic period during 1973–1974, when the modern theory of the strong interaction – quantum chromodynamics, or QCD – emerged, essentially in its current form. While we were active participants in that drama, we have not relied solely on memory but have carefully reviewed the contemporary literature.

At the end of 1972 there was no fundamental theory of the strong interaction – and no consensus on how to construct one. Proposals based on S-matrix philosophy, dual-resonance models, phenomenological quark models, current algebras, ideas about “partons” and chiral dynamics – the logical descendant of Hideki Yukawa’s original pion-exchange idea – created a voluminous and rapidly growing literature. None of those competing ideas, however, offered a framework in which uniquely defined calculations leading to sharp, testable predictions could be carried out. It seemed possible that strong-interaction physics would evolve along the lines of nuclear physics: one would gradually accumulate insight experimentally, and acquire command of an ever-larger range of phenomena through models and rules of thumb. An overarching theory worthy to stand beside Maxwell’s electrodynamics or Einstein’s general relativity was no more than a dream – and not a widely shared one.

Within less than two years the situation had transformed radically. We had arrived at a very specific candidate theory of the strong interaction, one based on precise, beautiful equations. And we had specific, quantitative proposals for testing it. The theoretical works [1–5] that were central to this transformation can be identified, we think, with considerable precision.

First clues

Let us briefly recall the key lines of evidence and thought that those works reconciled, synthesized and brought to fruition. They can be summarized under three headings: quarks and colour; scaling and partons; quantum field theory and the renormalization group.

Quarks and colour: A large body of strong-interaction phenomenology, including the particle spectrum and magnetic moments, had been organized using the idea that mesons and baryons are composite particles made from combinations of a small number of more fundamental constituents: quarks. This approach, which had its roots in the ideas of Murray Gell-Mann [6] and George Zweig [7], is reviewed in a nice book by J J J Kokkedee [8]. For the model to work, the quarks were required to have bizarre properties – qualitatively different from the properties of any known particles. Their electric charges had to be fractional. They had to have an extra internal “colour” degree of freedom [9,10]. Above all, they had to be confined. Extensive experimental searches for individual quarks gave negative results. Within the model quark–antiquark pairs made mesons, while quark–quark–quark triplets made baryons; single quarks had to be much heavier than mesons and baryons – if, indeed, they existed at all.

Scaling and partons: The famous electroproduction experiments at SLAC revealed, beginning in the late 1960s, that inclusive cross-sections did not exhibit the “soft” or “form factor” behaviour familiar in exclusive and purely hadronic processes (as explored up to that time). Richard Feynman [11] interpreted these experiments as indicating the existence of more fundamental point-like constituent particles within protons, which he called partons. His approach was intuitive, employing a form of impulse approximation. James Bjorken [12] arrived at related results earlier, using more formal operator methods (local current algebra). Current-algebra sum rules were derived using “quark–gluon” models with Abelian, flavourless gluons. The agreement of these sum rules with experimental results on electron and neutrino deep-inelastic scattering gave strong evidence that charged partons are spin 1/2 particles [13] and that they have baryon number 1/3 [14], i.e. that charged partons are quarks.

Quantum field theory and the renormalization group: Martinus Veltman and Gerardus ’t Hooft [15] brought powerful new tools to the study of perturbative renormalization theory, leading to a more rigorous, quantitative formulation of gauge theories of electroweak interactions. Kenneth Wilson introduced a wealth of new ideas, conveniently though rather obscurely referred to as the renormalization group, into the study of quantum field theory beyond the limits of perturbation theory. He used these ideas with great success to study critical phenomena. Neither of those developments related directly to the strong interaction problem but they formed an important intellectual background and inspiration. They showed that the possibilities for quantum field theory to describe physical behaviour were considerably richer than previously appreciated. Wilson [16] also sketched how his renormalization-group ideas might be used to study short-distance behaviour, with specific reference to problems in the strong interaction.

These various clues appeared to be mutually exclusive, or at least in considerable tension. The parton model is based on neglect of interference terms whose existence, however, is required by basic principles of quantum mechanics. Attempts to identify partons with dynamical quarks [17] were partially successful but ascribed a much more intricate structure to protons than was postulated in the simplistic quark models and unambiguously required additional, non-quark constituents. The confinement of quarks contradicted all previous experience in phenomenology. Furthermore, such behaviour could not be obtained within perturbative quantum field theory. There were numerous technical challenges in combining re-scaling transformations, as used in the renormalization group, with gauge symmetry.

But the most concrete, quantitative tension, and the one whose resolution ultimately broke the whole subject open, was the tension between the scaling behaviour observed experimentally at SLAC and the basic principles of quantum field theory. Several workers [18] expanded Wilson’s somewhat sketchy indications into a precise mapping between calculable properties of quantum field theories and observable aspects of inclusive cross-sections. Specifically, this work made it clear that the scaling behaviour observed at SLAC could be obtained only in quantum field theories with very small anomalous dimensions. (Strict scaling, which is equivalent to vanishing anomalous dimensions, cannot occur in a non-trivial – interacting – quantum field theory [19].) A few realized that approximate scaling could be achieved in an interacting quantum theory, if the effective interaction approached zero at short distances. Anthony Zee called such field theories “stagnant” (they are essentially what we now call asymptotically free theories) and he [20], Kurt Symanzik [21] and Giorgio Parisi [22] searched for such theories. However, none found any physically acceptable examples. Indeed, a powerful no-go result [23] demonstrated that no four-dimensional quantum field theory lacking non-Abelian gauge symmetry can be asymptotically free.

The tension between scaling and quantum field theory might be resolved but only within a special, limited class of theories

Our paper, submitted in April 1973 [1], alludes directly to these motivating issues in its opening: “Non-Abelian theories have received much attention recently as a means of constructing unified and renormalizable theories of the weak and electromagnetic interactions. In this note we report an investigation of the ultraviolet (UV) asymptotic behaviour of such theories. We have found that they possess the remarkable feature, perhaps unique among renormalizable theories, of asymptotically approaching free-field theory. Such asymptotically free theories will exhibit, for matrix elements between on-mass-shell states, Bjorken scaling. We therefore suggest that one should look to a non-Abelian gauge theory of the strong interactions to provide the explanation for Bjorken scaling, which has so far eluded field-theoretic understanding.”

Thus the tension between scaling and quantum field theory might be resolved but only within a special, limited class of theories. The paper surveys those possibilities and concludes: “One particularly appealing model is based on three triplets of fermions, with Gell-Mann’s SU(3)×SU(3) as a global symmetry and an SU(3) “colour” gauge group to provide the strong interactions. That is, the generators of the strong-interaction gauge group commute with ordinary SU(3)×SU(3) currents and mix quarks with the same isospin and hypercharge but different “colour”. In such a model the vector mesons are neutral and the structure of the operator product expansion of electromagnetic or weak currents is (assuming the strong coupling constant is in the domain of attraction of the origin!) essentially that of the free quark model (up to calculable logarithmic corrections).*” This was the first clear formulation of the theory that we know today as QCD. The footnote indicated by * refers to additional work, which became the core of our two subsequent papers [3, 4].

David Politzer’s paper [2] contains calculations of the renormalization group coefficients for non-Abelian gauge theories with fermions, broadly along the same lines as in our first paper quoted above [1]. It does not refer to the problem of understanding scaling in the hadronic strong interaction. (The reference to “strong interactions” in the title is generic.) Politzer emphasized the importance of the converse of asymptotic freedom – that is, that the effective coupling grows at long distances. He remarks that this could lead to surprises regarding the particle content of asymptotically free theories and support dynamical symmetry breaking. Although we arrived at our results independently, we and Politzer learnt of each other’s work before publication, compared results, requested simultaneous publication and referred to one another. The paper by Howard Georgi and Politzer [5] adopts QCD without comment and independently derives predictions for deviations from scaling parallel to the corresponding parts of our papers [3, 4].

Further reflections

The preceding account omits several sidelights and near misses, and lots of prehistory. But, although it is incomplete, we do not think it is distorted.

It may be appropriate to mention explicitly contributions by two extremely eminent physicists (with collaborators) that are often cited together with papers 1–5 in ways that can be misleading.

’t Hooft, together with Veltman, had developed effective methods for calculating quantum corrections in non-Abelian gauge theories. They had worked out many examples, specifically including one-loop wave function and vertex divergences [24]. It would not have been very difficult, as a technical matter, to re-assemble pieces of those calculations to construct calculations of renormalization group coefficients. ’t Hooft attests – and Symanzik corroborated – that he announced a negative value of the β function for non-Abelian gauge theories with fermions at a conference in Marseilles in the summer of 1972. Unfortunately, there is no record of this in the workshop proceedings, nor in the contemporary literature, so there is no documentation regarding the exact content of the announcement or its context. It had no influence on papers 1–5. In his contemporary work on the strong interaction, ’t Hooft adopted a completely different perspective from that of Gross-Wilczek and Georgi-Politzer, a perspective from which it would be very difficult to arrive at QCD and its property of asymptotic freedom as we understand them today. Specifically, ’t Hooft’s work considered a spontaneously broken gauge theory with hadrons as the fundamental objects, e.g. ρ mesons as gauge particles. His relevant publications immediately following papers 1–5 supply alternative methods for calculating renormalization group coefficients but do not propose specific physical applications.

Two contributions involving Gell-Mann and collaborators are sometimes cited as sources of QCD. The first is the “Rochester Conference” at Fermilab in the summer of 1972 [25]. It contains two relevant presentations, Gell-Mann’s summary talk and a contributed paper with Fritzsch, entitled “Current Algebra: Quarks and What Else?” In the summary talk, SLAC scaling is mentioned and interpreted in terms of “quarks, treated formally”. The discussion is not rooted in quantum field theory; indeed, most of the discussion of the strong interaction, by far, is given over to S-matrix and dual-resonance ideas. The presentation with Fritzsch briefly mentions the possibility of using colour octet gluons, as one among several possibilities for extending light-cone current algebra (again, not within a quantum field theory).

The second contribution [26] appeared after papers 1–5 and refers to them. From a historical perspective, what is particularly revealing about it is the comment: “For us, the result that the colour octet field theory model comes closer to asymptotic scaling than the colour singlet model is interesting, but not necessarily conclusive, since we conjecture that there may be a modification at high frequencies that produces true asymptotic scaling.”

As events unfolded, the most profound and most fruitful aspects of QCD and asymptotic freedom proved to be their embodiment in a rigorously defined, quantitatively precise quantum field theory, which could be tested through its prediction of deviations from scaling. Yet just those aspects are what the authors hesitated to accept, even after they had been analysed.

The emergence of a specific, precise quantum field theory for the strong interaction – featuring beautiful equations – marked a watershed. Remarkable progress ensued on several fronts.

The realization that basic strong interaction processes at high energy could be calculated in a practical, controlled and systematically improvable way opened up many applications (figure 1). The subject now called perturbative QCD, which refines and improves parton model ideas, is a direct outgrowth of papers 1–5 but extends their scope almost beyond recognition. Perturbative QCD is the subject of several large textbooks, dozens of conference proceedings, etc. It has become the essential foundation for analysing experimental results from high-energy accelerators including, notably, the LHC. It justifies, in particular, the identification of “jets” with quarks and gluons (figure 2), and allows calculation of their production rates.

The paradoxical heuristics of the quark model, with its juxtaposition of free-particle properties with confinement, became physically plausible and matured into a well-posed mathematical problem [4]. The growth of the effective coupling with increasing distance, together with the existence of formally massless (colour-)charged particles, brought the theory into uncharted territory. Because uncancelled field energy threatens to build up catastrophically, it was plausible that only colour-singlet states might emerge with finite energy. Essentially new mathematical techniques were invented to address this challenge. The most successful of these, based on direct numerical solution of the equations (so-called “lattice gauge theory”), has gone far beyond demonstrating confinement to yield sharp quantitative results for the mass spectrum and for many detailed properties of hadrons.

The equations of QCD are rooted in the same mathematics of gauge symmetry

More generally, the dramatic success of a fully realized quantum field theory in yielding a wealth of striking physical phenomena that are not evident in a linear approximation – including emergence of a dynamical scale (“mass without mass”), dynamical symmetry breaking, a rich physical spectrum and, of course, confinement – helped catalyse a renewed interest in the deep possibilities of quantum field theory. It continues to surprise us today.

Prior to papers 1–5, the behaviour of matter at ultrahigh temperatures and densities seemed utterly inaccessible to theoretical understanding. After these papers, it was understood instead to be remarkably simple. That circumstance opened up the earliest moments of the Big Bang to scientific analysis. It is the foundation of what has become a large and fruitful field: astroparticle physics.

The equations of QCD are rooted in the same mathematics of gauge symmetry [27] that underlies the modern theory of electroweak interactions. They are worthy to stand beside Maxwell’s equations; one might even say they are an enriched version of those equations. It becomes possible to contemplate still more extensive symmetries, unifying the different forces. The methods used to establish asymptotic freedom – specifically, running couplings – provide quantitative tools for exploring that idea. Intriguing, encouraging results have been obtained along these lines. They suggest, in particular, the possibility of low-energy supersymmetry, such as might be observed at the LHC.
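The quantitative tool mentioned here, the running coupling, can be illustrated numerically. The sketch below uses the standard one-loop formula with illustrative modern inputs (α_s(M_Z) ≈ 0.118, five quark flavours) that are conventions of today's practice, not numbers from the papers under discussion:

```python
import math

def alpha_s_one_loop(q, mu=91.19, alpha_mu=0.118, n_f=5):
    """One-loop running of the strong coupling from a reference scale mu (GeV)
    to scale q (GeV), solving d(alpha)/d(ln q^2) = -b0*alpha^2/(4*pi) with
    b0 = 11 - 2*n_f/3 (the negative beta function of asymptotic freedom)."""
    b0 = 11.0 - 2.0 * n_f / 3.0
    return alpha_mu / (1.0 + alpha_mu * b0 / (4.0 * math.pi) * math.log(q * q / (mu * mu)))

# The coupling shrinks at short distances (high q) and grows at long
# distances (low q) -- the converse behaviour that Politzer emphasized.
print(alpha_s_one_loop(1000.0))  # weaker than 0.118
print(alpha_s_one_loop(10.0))    # stronger than 0.118
```

The sign of b0, positive only for non-Abelian gauge theories with few enough fermions, is the mathematical core of the no-go and existence results discussed above.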

ICFP 2012 opens up interdisciplinarity


The International Conference on New Frontiers in Physics (ICFP) aims to promote scientific exchange between different areas of fundamental physics, with particular emphasis on future plans and related open questions. The first in the new series, ICFP 2012, which took place in Kolymbari, Crete, attracted 140 participants from fields ranging from particle physics and cosmology to quantum physics and the foundations of quantum mechanics – a field recognized by the 2012 Nobel Prize in Physics. The following highlights reflect the main themes of the plenary talks, which were further elaborated in many parallel sessions.

One of the last conferences to hear enticing hints of an imminent Higgs-boson discovery

ICFP 2012 was one of the last conferences to hear enticing hints of an imminent Higgs-boson discovery, as the ATLAS and CMS collaborations at the LHC presented candidate signals for the Higgs boson with a local significance of 2.5–2.8σ at a mass of 125–126 GeV. At the same time, the CDF and DØ collaborations from the Tevatron at Fermilab also reported an excess near the same mass region with a local significance of 2.7σ. In other presentations, state-of-the-art theoretical calculations of the cross-section for a Standard Model Higgs boson were described, as well as a prediction for the Higgs boson mass of 121–126 GeV and the supersymmetric spectrum from finite unified theories. Implications beyond the Standard Model of both the mass and the large diphoton rate observed were also discussed. Reports on experimental searches for new physics, such as excited leptons, heavy neutrinos, new bosons, supersymmetry and gravity signatures, went further beyond the Standard Model, as did discussions of string theory and extra dimensions. Results from the LHC on di-jets produced in association with vector bosons excluded, at 95% confidence level, the structure that the CDF experiment had reported two years earlier.

Talks on hadrons and QCD covered the latest lattice QCD results and presented theoretical predictions and the status of new states with heavy quarks and exotic hadrons, such as the Zb states discovered in 2011 by the Belle experiment at KEK. The latter are consistent with a minimal content of two quarks and two antiquarks. Within a new extended quark model that has both quarks and diquarks as building blocks, new QCD effects and interpretations emerge; for example, there are no radial excitations in low-energy QCD and hadrons can shrink. Reflecting the interdisciplinary theme of the conference, one approach to the description of the QCD phase diagram that was discussed involves a holographic model; Lorentz violation and holography were also discussed.

Highlights from heavy-ion experiments confirm that the hot and dense medium created in heavy-ion collisions behaves like a strongly interacting, almost perfect liquid – the strongly interacting quark–gluon plasma. The estimates of shear viscosity are consistent with the lower bound of the anti-de Sitter/conformal field-theory correspondence. The generated flow seems to affect even heavy particles, while jets and hadrons with high transverse momentum are strongly quenched as they traverse this medium. An analogy was made between the higher-order flow coefficients that originate from the initial fluctuations of the “Little Bang” in central heavy-ion collisions and the measurements of the cosmic microwave background radiation that explore the initial fluctuations of the early universe after the Big Bang. Outstanding results have come from measurements of quarkonia, such as the indication of sequential suppression of quarkonia and of possible J/ψ regeneration at the LHC. The directly produced Υ(1S) state is not suppressed either at Brookhaven’s Relativistic Heavy-Ion Collider (RHIC) or at the LHC, while charmonium and bottomonium states with smaller dissociation temperatures than the Υ(1S) show a suppression at both RHIC and the LHC – as expected for a deconfined plasma of quarks and gluons within a colour-screening scenario.

An overview described the status of rare decays and CP violation, while results on the latter from LHCb and other LHC experiments set strong constraints on models and led to intriguing results that await an explanation either inside or outside the Standard Model. In particular, the isospin asymmetry in B → Kμ+μ– differs from the expectation by 4σ, while CP violation in the charm sector shows a 3.5σ deviation from the CP-conserving hypothesis. Results from the BaBar experiment at SLAC highlight a significant excess of events in B → D*τν decays at 3.4σ above the Standard-Model expectation, thus ruling out the type II two-Higgs-doublet model. BaBar has also made a direct observation of time-reversal violation at the 14σ level. The CP violation seen by LHCb in D-meson decays could arise from a fourth generation of quarks and leptons.

In the neutrino sector, an overview described the status of experiments on neutrinoless double-beta decay and their expected reach. According to the “forecast” given, the claimed evidence of the signal reported in 2001 by a subset of the Heidelberg-Moscow collaboration will be checked by the GERDA experiment in the Gran Sasso National Laboratory in the near future. Currently, the EXO-200 experiment sets the most competitive limit in the field and almost completely rules out the claim. The OPERA collaboration reported on new oscillation results from the search for ντ appearance, preliminary limits on oscillation parameters from the search for νμ → νe and an update on the measurement of neutrino velocity. New results from the T2K experiment in Japan confirm the first evidence for νe appearance presented in 2011 and provide a measurement of sin²2θ13. In reactor experiments, the Double Chooz collaboration presented results on sin²2θ13 that exclude the non-oscillation scenario at 3.1σ, while the high-precision measurements of sin²2θ13 presented by the Daya Bay collaboration exclude a zero value for θ13 at more than 7σ.

The quest for dark matter

The quest to determine the nature of dark matter is a challenge at the boundary of particle physics and astrophysics. Possible hints, for example from the Fermi Gamma-ray Space Telescope and the PAMELA experiment in space, were discussed in an overview of experimental searches and theoretical implications and expectations. Other results included limits on compact halo objects as dark matter obtained from gravitational microlensing, as well as the status of the Alpha Magnetic Spectrometer (AMS-02), which has been in orbit since May 2011. The status and recent upgrades of the DAMA/LIBRA experiment, and its 8.9σ observation of an annual-modulation signature – a candidate signal for dark-matter particles in the galactic halo – were reviewed at the conference, together with detailed studies of backgrounds. Other talks covered primordial scalar perturbations via conformal mechanisms and the experimental status of the Dark Energy Survey.

At a more mathematical level, participants learnt how gravity can be viewed as emerging out of the differential calculus in non-commutative geometry, with effects that include a separation of the inertial and gravitational masses of a test particle as its mass approaches the Planck mass. Aspects of string cosmology included a review of bouncing string-cosmologies in which the Big Bang is no longer regarded as the beginning of time, as well as a presentation on how dilaton-field dominance in early epochs enlarges the cosmologically allowed parameter space for supersymmetry at the LHC.

Talks on quantum physics covered, for example, Aharonov’s two-state vector formalism, in which hidden variables may exist if the requirement of causality is relaxed to allow – under appropriate circumstances – the effects of future events on past measurements. Transaction and non-locality in quantum field theory and cosmological consequences of a de Sitter non-local vacuum, involving David Bohm’s “holomovement” ideas, were also discussed, providing a link between cosmology and quantum physics, as were classical and quantum information acquisition, measurement and the positive-operator valued measure. An overview of quantum physics with massive objects included, among other topics, the possibility of testing the predictions of quantum gravity, as well as the experimental perspectives of atom–photon interactions.

The future of physics

At a broader level, an overview talk presented the European Physical Society and its activities. Moreover, with an eye to future generations of physicists, a presentation on educational projects was given to high-school teachers in nearby Chania, the second-largest city on Crete.

Sessions during the last two days of the conference addressed the future plans of particle and nuclear physics. These included the status of the eRHIC electron–ion collider project at Brookhaven and the Nuclotron-based Ion Collider facility at JINR, as well as an overview and outlook on heavy-ion collisions at the LHC. There were also presentations on the status and plans of major particle-physics projects, namely the Muon Collider, the International Linear Collider, the Compact Linear Collider and Super B. In addition, CERN’s future plans were highlighted, as were the ideas and actions of the European Strategy for Particle Physics group and its update plan, which is currently under preparation. The conference closed with an overview of the activities of the European Committee for Future Accelerators.

To prepare not only the students but all of the audience for an interdisciplinary week, a day of lectures preceded the conference. Discussions during the sessions, as well as more informal exchanges, then offered the possibility to explore interdisciplinary knowledge. Results from these interactions appear in the papers contributed to the conference proceedings, which will be peer reviewed and published in the EPJ Web of Conferences in 2013.

First results from proton–lead colliding beams

On 12 September, during a short, highly successful pilot run, the LHC operated with protons in one beam and lead ions in the other, so providing the LHC experiments with their first proton–nucleus collision data and opening new horizons for the heavy-ion community at CERN. During these few hours of pilot running, the ALICE experiment collected about 2 million events, sufficient not only to check the readiness of the detector for the long proton-ion run scheduled for the beginning of 2013, but also to perform a first analysis of the data and produce important physics results.

After the start of the heavy-ion physics programme in 2010, the LHC experiments obtained many striking results related to the formation of the hot and dense state of strongly interacting matter emerging from the collisions of lead nuclei. This state – the quark–gluon plasma (QGP) – is expected to manifest itself through various signatures, such as the suppression of high-energy jets in the medium, collective particle motion, enhancement of strange-particle production and suppressed quarkonium production. In addition, surprising scaling effects were observed in the particle multiplicity compared with measurements at lower energies. However, given the complexity of the lead–lead (PbPb) colliding system, an important step in the quest for QGP lies in decoupling the effects of cold nuclear matter that arise at the initial stage of the collisions.

The proton–nucleus system represents the perfect benchmark for studying these effects because the colliding components are elementary and give rise to processes where the effects of the medium produced in the collision are either small or even totally absent. The collisions are also interesting because they allow the exploration of nuclear parton distributions in the region of small parton fractional momenta, which are so far unmeasured. Proton–nucleus collisions can therefore provide the data needed to understand better the properties of PbPb collisions at the energy of the LHC. The study of the dense initial state also provides access to a completely new QCD regime where the parton densities are expected to be saturated.


Using the newly acquired data, the ALICE collaboration has been able to measure the charged-particle multiplicity density in proton–lead (pPb) collisions at a centre-of-mass energy per nucleon pair of √sNN = 5.02 TeV (ALICE collaboration 2012a). Figure 1 compares this measurement with two main groups of theoretical models. The first group consists of models that incorporate nuclear modification – for example, shadowing – of the initial parton distributions; the second includes various saturation models. While the current experimental and theoretical precision is not sufficient for a detailed comparison, the figure shows that the data are described best by the model where the gluon shadowing parameter (sg) is tuned to previous experimental data at lower energies. Saturation models predict a much steeper dependence on pseudorapidity, which is not confirmed by the measurement.


Another important result from the analysis of the proton–nucleus data concerns the charged-particle transverse-momentum spectrum and the corresponding nuclear-modification factor (ALICE collaboration 2012b). The latter is calculated using the proton–proton data at collision energies of 2.76 TeV and 7 TeV as reference (figure 2). The result clearly indicates little or no modification of the production of charged particles with transverse momentum greater than 2 GeV/c, thus confirming that the suppression of high-energy jets in PbPb collisions is not a result of cold nuclear-matter effects. The comparison with the available theoretical predictions suggests that the models require further development because they have difficulties in describing the multiplicity and the transverse-momentum spectrum simultaneously.
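The nuclear-modification factor used in such analyses is a simple per-collision ratio. A minimal sketch with invented numbers (the yields and the number of binary collisions ⟨N_coll⟩ below are illustrative, not ALICE values):

```python
def nuclear_modification(yield_pa, yield_pp, n_coll):
    """R_pA = (pA yield) / (<N_coll> * pp reference yield) at fixed pT.
    R_pA near 1 means production is an incoherent superposition of
    independent nucleon-nucleon collisions, i.e. no nuclear modification."""
    return yield_pa / (n_coll * yield_pp)

# Illustrative only: a pPb yield consistent with <N_coll> = 7 scaled pp
# production gives R_pPb close to unity, as ALICE observes at high pT.
r = nuclear_modification(yield_pa=6.9, yield_pp=1.0, n_coll=7)
print(round(r, 2))
```

By contrast, the strong high-pT suppression in PbPb collisions corresponds to values of this ratio well below one.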

Measurement of photons stimulates quest for QGP temperature

One of the classic signals expected for a quark–gluon plasma (QGP) is the radiation of “thermal photons”, with a spectrum reflecting the temperature of the system. With a mean free path much larger than nuclear scales, these photons leave the reaction zone created in a nucleus–nucleus collision unscathed. So, unlike hadrons, they provide a direct means to examine the early hot phase of the collision.

However, thermal photons are produced throughout the entire evolution of the reaction and also after the transition of the QGP to a hot gas of hadrons. In the PbPb collisions at the LHC, thermal photons are expected to be a significant source of photons at low energies (transverse momenta, pT, less than around 5 GeV/c). The experimental challenge in detecting them comes from the huge background of photons from hadron decays, predominantly from the two-photon decays of neutral pions and η mesons.


The ALICE experiment has measured photons produced in central PbPb collisions at a centre-of-mass energy per colliding nucleon pair, √sNN = 2.76 TeV, by reconstructing with the time-projection chamber the tracks of e+e– pairs produced by the conversion of photons in the inner detectors. The same sample of photons was also used to measure the pT spectrum of neutral pions. The analysis found an excess of direct photons of around 15% for 1 < pT < 5 GeV/c compared with the calculated decay-photon yields from neutral pions, η mesons and other mesons, with a somewhat larger excess at higher pT.

Direct photons are defined as photons not coming from decays of hadrons, so photons from initial hard parton-scatterings (prompt photons and photons produced in the fragmentation of jets) – i.e. processes already present in proton–proton collisions – contribute to the signal. Indeed, for pT greater than around 4 GeV/c, the measured spectrum agrees with that for photons from initial hard scattering obtained in a next-to-leading-order perturbative QCD calculation. For lower pT, however, the spectrum has an exponential shape and lies significantly above the expectation for hard scattering, as the figure shows.

The inverse slope parameter measured by ALICE, TLHC = 304 ± 51 (stat.+syst.) MeV, is larger than the value observed in gold–gold collisions at √sNN = 0.2 TeV at Brookhaven’s Relativistic Heavy-Ion Collider (RHIC), TRHIC = 221 ± 19 (stat.) ± 19 (syst.) MeV. In typical hydrodynamic models, this parameter corresponds to an effective temperature averaged over the time evolution of the reaction. The measured values suggest initial temperatures well above the critical temperature of 150–160 MeV (approx. 1.8 × 1012 K) at which the transition between ordinary hadronic matter and the QGP occurs. The ALICE measurement also indicates that the LHC has produced the hottest piece of matter ever formed in a laboratory.
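The quoted Kelvin equivalent follows from dividing a temperature expressed in energy units by the Boltzmann constant; a quick check of the arithmetic:

```python
K_B_MEV_PER_KELVIN = 8.617e-11  # Boltzmann constant in MeV per kelvin

def mev_to_kelvin(t_mev):
    """Convert a temperature quoted in MeV to kelvin: T[K] = T[MeV] / k_B."""
    return t_mev / K_B_MEV_PER_KELVIN

# The 150-160 MeV critical temperature corresponds to roughly 1.8e12 K,
# and the measured inverse slopes sit well above it.
print(mev_to_kelvin(155.0))
print(mev_to_kelvin(304.0) > mev_to_kelvin(221.0))
```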

Leptons on the trail of the unexpected

Searches in LHC data that do not depend on specific theoretical models provide a valuable complement to optimized, model-dependent searches because they have the capacity to uncover hints of the completely unexpected. In this spirit, the ATLAS collaboration has recently looked for events with like-sign leptons and three or more leptons, using the full 2011 LHC data set of nearly 5 fb–1, in the pursuit of signs of new physics. Unfortunately, no excess of events over the Standard Model expectation was observed. However, the analyses have provided the information needed to set limits on a range of models, including the production of doubly charged Higgs bosons.

Prompt like-sign lepton pairs are rarely produced in Standard Model processes but they may be produced by fourth-generation quarks, supersymmetry, universal extra dimensions, extended Higgs sectors or other new-physics models. A recent study by ATLAS selected isolated electrons and muons and divided the events into dielectron, dimuon and electron–muon categories. This analysis yielded upper limits on the cross-section for anomalous production of like-sign lepton pairs ranging between 1.7 fb and 64 fb (ATLAS 2012a). An extension of the analysis set limits on the production of doubly charged Higgs bosons decaying to pairs of electrons or muons (ATLAS 2012b).


Events with three or more prompt leptons in the final state are also rare in the Standard Model. A recent search for multilepton events by ATLAS identified isolated electrons, isolated muons and hadronically decaying taus and found only 1827 events with three or more leptons. These events were divided into four categories, depending on whether or not a Z boson was reconstructed from two opposite-charge electrons or muons in the event, and whether or not a tau candidate was present.

The figure shows results for these four categories: the limits on the number of events from non-Standard Model sources have been calculated and converted into limits on the “visible cross-section”, i.e. the cross-section that is observable after event selection. The limits on the visible cross-section are given as a function of increasing lower bounds on the missing transverse momentum, a quantity that may be large in models with new physics. The smallest lower bound, “X”, is 0 GeV for the off-Z channels (no reconstructed Z) and 20 GeV for the on-Z channels (with reconstructed Z). Limits are shown for events in which the summed transverse momentum of the jets in the event (HTjets) exceeds 100 GeV; an upcoming publication includes the corresponding limits for lower values of HTjets and other variables of interest. These visible cross-section limits can be converted into upper limits on the cross-section for many specific models, including the doubly charged Higgs and new theories yet to come.
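The step from an event-count limit to a visible-cross-section limit is a division by the integrated luminosity. A sketch with invented inputs (the 10-event limit is hypothetical; the ~4.7 fb–1 is in line with the 2011 data set mentioned above):

```python
def visible_xsec_limit_fb(n_events_limit, int_lumi_fb):
    """Upper limit on the visible cross-section in fb: sigma_vis = N / L.
    'Visible' means after trigger and selection, so converting to a limit
    on a specific model requires dividing further by that model's
    acceptance times efficiency."""
    return n_events_limit / int_lumi_fb

# Hypothetical 95% CL limit of 10 signal events in 4.7 fb^-1 of data:
print(visible_xsec_limit_fb(10.0, 4.7))  # about 2.1 fb
```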

CMS homes in on the heaviest quark

The top quark is the heaviest point-like particle known. It weighs about as much as an atom of tungsten yet is an elementary building block of the Standard Model of particle physics. Its mass is one of the model’s important parameters and is directly related via radiative corrections to the masses of the W and Higgs bosons. Precise knowledge of the top quark’s mass is therefore extremely valuable to constrain theoretical models.

The CMS collaboration has measured the top-quark mass by exploiting all possible final states originating from different decays of W bosons produced in the decays of top quarks. Final states where the W boson decays into leptons are particularly “clean” (see figure). Such events are selected by requiring energetic jets in the central region of the CMS detector, of which at least one must be compatible with originating from a bottom quark (“b-tagged jet”), together with one or two isolated and high-energy leptons. The selected samples are extremely pure in top-quark-pair events, with estimated purities greater than 95% for events containing at least one electron or muon.

[Figure: CCnew11_10_12]

For hadronically decaying W bosons, the reconstruction techniques make use of kinematic fits to improve the energy resolution, and of likelihood methods that handle the combinatorial ambiguities in finding the triplet of jets corresponding to the top-quark decay. The use of b-tagging helps considerably in constraining these ambiguities further. For dilepton events, the presence of two neutrinos accompanying the charged leptons from the W-boson decays requires alternative techniques.

All of the methods and channels used give consistent measurements of the top-quark mass. The results are now fully dominated by uncertainties other than statistical, with major contributions coming from the uncertainty associated with the jet-energy scale and how well the Monte Carlo simulations model the details of the top decay. The best single measurement of the mass of the top quark, from the e/μ+jets channel, results in a statistical uncertainty of 0.4 GeV and a systematic uncertainty of around 1 GeV.

The combined CMS measurement, accounting for correlations between uncertainties obtained in the individual channels, yields a total uncertainty of about 1 GeV. This result is already competitive (and in agreement) with the combined measurement from the CDF and DØ experiments at Fermilab’s Tevatron, as the figure shows. For a further reduction of the uncertainty, it will become important to employ novel measurement techniques.
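A combination of this kind is typically performed with the best-linear-unbiased-estimator (BLUE) technique, which weights the individual measurements by the inverse of their full covariance matrix. The sketch below illustrates the idea for two hypothetical channels with partially correlated systematic uncertainties; all numbers are invented for illustration and are not the actual CMS inputs.

```python
import numpy as np

# Two hypothetical top-mass measurements (GeV). Values are illustrative
# only, not the real CMS channel results.
m = np.array([173.5, 172.7])   # central values
stat = np.array([0.4, 0.8])    # statistical uncertainties (uncorrelated)
syst = np.array([1.0, 1.2])    # systematic uncertainties
rho = 0.5                      # assumed correlation between the systematics

# Full covariance matrix: diagonal statistical part plus correlated
# systematic part.
cov = np.diag(stat**2) + np.array(
    [[syst[0]**2,               rho * syst[0] * syst[1]],
     [rho * syst[0] * syst[1],  syst[1]**2]])

# BLUE weights w = C^-1 1 / (1^T C^-1 1): the linear unbiased
# combination (weights summing to one) with the smallest variance.
ones = np.ones(2)
cinv = np.linalg.inv(cov)
w = cinv @ ones / (ones @ cinv @ ones)

m_comb = w @ m
sigma_comb = np.sqrt(w @ cov @ w)
print(f"combined mass = {m_comb:.2f} +- {sigma_comb:.2f} GeV")
```

Because the unit weight vectors are allowed combinations, the BLUE uncertainty is never larger than that of the best single input; with strong correlations, however, the gain from combining can be small, which is why novel techniques become important for further progress.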

[Figure: CCnew12_10_12]

The CMS collaboration has also measured the difference in mass between the top quark and its antiquark – an important test of the symmetry between matter and antimatter. This is done by splitting the sample of events with e/μ+jets into two subsamples with opposite lepton charges. The difference in quark–antiquark masses is compatible with zero with an uncertainty of about 0.5 GeV. This is the best precision on this mass difference to date.

After more than 15 years of precision top physics at the Tevatron, the baton in the race to understand nature’s heaviest quark has now passed to the LHC. With an uncertainty on the top-quark mass of 1 GeV, CMS is now at the forefront of precision physics in the top sector.

LHCb reports first 5σ observation of charm mixing

The large cross-section for charm production at the LHC, and the geometry and instrumentation of the LHCb detector, provide samples of charmed hadrons far larger than those accumulated by previous experiments. These allow the Standard Model to be tested by studying various interesting phenomena such as CP violation and mixing in D0 mesons.

The weak interaction can cause D0 mesons (consisting of a charm quark and an anti-up quark) to transform into their antiparticle, D̄0 (an anti-charm and an up quark), and back. Such “flavour oscillations” or “mixing” have been observed and studied in detail in K0, B0 and Bs0 mesons. In the charm system, however, the period of the oscillations is so long – more than a hundred times the average lifetime of a D0 meson – that although the BaBar, Belle and CDF collaborations have reported strong evidence for the effect, none has been able to claim an unambiguous observation.

[Figure: CCnew14_10_12]

One of the best channels in which to search for charm mixing is the decay D0 → Kπ. The initial flavour is identified by the charge of the accompanying pion in the decay D*+ → D0π+ or D*– → D̄0π–. The mixing effect appears as a decay-time dependence of the ratio R between the numbers of reconstructed “wrong-sign” (WS) and “right-sign” (RS) processes – D0 → K+π– and D0 → K–π+, respectively, and their charge conjugates. The WS process can proceed either through a doubly Cabibbo-suppressed decay or through flavour oscillation followed by a Cabibbo-favoured decay. In the absence of mixing, R is constant as a function of the D0 decay time, t, while in the case of mixing it is predicted to be an approximately parabolic function of t. Determining R in bins of t therefore allows a measurement of the mixing parameters, while also cancelling many potential sources of systematic uncertainty.
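To first order in the small mixing parameters x′ and y′, and writing R_D for the doubly Cabibbo-suppressed rate and τ for the D0 lifetime, the approximately parabolic time dependence takes the standard form (quoted here from the usual charm-mixing formalism, not from the LHCb paper itself):

```latex
R(t) \;\simeq\; R_D \;+\; \sqrt{R_D}\,y'\,\frac{t}{\tau}
\;+\; \frac{x'^2 + y'^2}{4}\left(\frac{t}{\tau}\right)^2 .
```

With no mixing, x′ = y′ = 0 and R(t) reduces to the constant R_D; any slope or curvature in R versus t is therefore a direct signature of oscillations.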

The figure shows the ratio of WS to RS decays, measured by the LHCb experiment, as a function of decay time, from a total of 36,000 WS and 8.4 million RS decays selected from the 1.0 fb–1 of data recorded in 2011. The horizontal dashed line shows the no-mixing hypothesis; the solid line is the best fit to the data when mixing is allowed. The clear time-dependence excludes the no-mixing hypothesis at 9.1σ. The oscillation parameters are determined with uncertainties about a factor of two smaller than in previous measurements.

Since the Standard Model predictions for the mixing parameters have large uncertainties, the next step will be to focus on cleaner observables to search for possible contributions from new physics. In particular, LHCb is now well placed to investigate whether there is a CP-violating contribution to the oscillations, in contrast to the Standard Model expectation. This will be achieved by studying charm mixing in this and other decay channels and exploiting the large increase in data following the successful 2012 LHC run.

XMM-Newton discovers new source of cosmic rays

Researchers using the European X-ray astronomy satellite XMM-Newton have discovered a new source of low-energy cosmic rays in the vicinity of the Arches cluster, near the centre of the Milky Way. These particles have a different origin from the higher-energy cosmic rays produced in supernova explosions.

Low-energy cosmic rays with kinetic energy less than half a billion electronvolts are not detected at Earth, since the solar wind prevents them from entering the heliosphere. Therefore little is known about their chemical composition and flux outside the solar system.

V Tatischeff, A Decourchelle and G Maurin, from the CNRS and CEA institutes in France, began by studying the X-ray emission that should theoretically be generated by low-energy cosmic rays in the interstellar medium. They then looked for signs of this in X-ray data collected by XMM-Newton since its launch in 1999. By analysing the properties of the X-ray emission of interstellar iron recorded by the satellite, they found the signature of a large population of fast-moving ions in the vicinity of the Arches cluster, about 100 light-years from the centre of the Milky Way. The stars in this cluster are travelling together at approximately 700,000 km/h. The cosmic rays are in all likelihood produced in the high-speed collision of the star cluster with a gas cloud in its path.

This is the first time that a major source of low-energy cosmic rays has been discovered outside the solar system. It shows that the shock waves of supernovae are not the only objects capable of accelerating atomic nuclei in bulk in the galaxy. These findings should make it possible to identify new sources of ions in the interstellar medium, and may lead to a better understanding of the effects of these energetic particles on star formation.
