Boosting sensitivity to new physics

The poster for BOOST2012

The LHC is a tremendously powerful tool built to explore physics at the tera-electron-volt scale. This year it is being operated with a centre-of-mass energy of 8 TeV, which is a little over half of the full design energy. This beats by a factor of four the previous world record held by Fermilab’s Tevatron, which shut down a year ago. For the study of ultrahigh-energy collisions, a second figure of merit of the machine – luminosity – is also of crucial importance. Here, the LHC is living up to its promise. In the first eight months of proton–proton operations in 2012, the ATLAS and CMS experiments have registered close to 10 fb⁻¹ at 8 TeV, a data set similar to that collected during the entire 10-year Run II at the Tevatron.

In this uncharted territory at the energy frontier, known particles behave in unfamiliar ways. For the first time, the heaviest known particles – the W and Z bosons, the top quark and the recently discovered new boson – do not seem quite so heavy. Their rest masses (of the order 100 GeV) are small compared with the energy unleashed in the most energetic collisions, which can be up to several tera-electron-volts. Therefore, every so often these massive particles are produced with an enormous surplus of kinetic energy such that they fly through the laboratory at enormous speed.

A serious challenge

The velocity of these massive particles has implications for the way that they are observed in experiments. For particles produced with a large boost, the decay products (leptons or jets of hadrons) are emitted at small angles to the original direction of their parent. The full energy of the massive particle is deposited in a tiny area of the detector. Reconstructing and selecting these highly collimated topologies represents a serious challenge. For a sufficiently large boost, the two jets of particles that appear in hadronic two-body decays (W, Z, H → qq) cannot be resolved by standard reconstruction algorithms.

An approach pioneered by Michael Seymour, now at Manchester University, provides an interesting alternative by simply turning the problem round (Seymour 1994). Instead of trying to resolve two jets and adding up their momenta to reconstruct the parent particle, the technique is to reconstruct a single jet that contains the full energy of the decay. The fat jet containing the decay of a boosted object must then be distinguished from ordinary jets that are produced by the million at the LHC. This is achieved through an analysis of the jet’s internal structure. This alternative appears to be the most promising approach whenever the energy of the massive particle exceeds its rest mass by a factor of three or more. The boosted regime thus starts at an energy a little over 200 GeV for a W boson and at roughly 500 GeV for a top quark.
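The thresholds quoted above follow from a standard collider rule of thumb: the angular separation of the decay products scales as ΔR ≈ 2m/pT. A minimal sketch (the function name and the numerical inputs are illustrative, not from the article):

```python
def opening_angle(mass_gev, pt_gev):
    """Rule-of-thumb angular separation (Delta R) between the decay
    products of a particle of mass m at transverse momentum pT:
    Delta R ~ 2 m / pT (two-body decay; rougher for three-body)."""
    return 2.0 * mass_gev / pt_gev

# A W boson (m ~ 80 GeV) at pT = 250 GeV: products roughly 0.64 apart
# in Delta R, already hard to resolve as two jets of radius 0.4-0.5.
dr_w = opening_angle(80.4, 250.0)

# A top quark (m ~ 173 GeV) at pT = 500 GeV: the decay starts to fit
# inside a single "fat" jet of radius ~1.0.
dr_top = opening_angle(173.0, 500.0)
```

With these numbers the boosted regime indeed opens up a little above 200 GeV for a W boson and around 500 GeV for a top quark, as stated in the text.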

Dawn of the boosted era

The LHC is the first machine where boosted objects are a crucial part of the physics programme. A more quantitative grasp of exactly how the LHC crosses the threshold of the boosted regime is obtained by comparing the expectation in the Standard Model for the production rate of top quarks – the heaviest known particle – at past, present and future colliders.

Since the discovery of the top quark in 1995, the Tevatron has produced tens of thousands of these particles. A large majority of these were produced approximately at rest; only two dozen or so top-quark pairs had a mass exceeding 1 TeV. By contrast, the LHC is a real top factory. In 2012 alone, it has already produced more than 20 times as many top-quark pairs as the Tevatron had in its lifetime. At the LHC, most top-quark pairs are still produced close to threshold but production in the boosted regime increases by several orders of magnitude. Several tens of thousands of top-quark pairs will be produced this year with m(tt) > 1 TeV.

Impressive as these numbers may be, these first years mark just the start of a long programme. After a shutdown in 2013–2014, the LHC should emerge in its full glory, with protons colliding close to the design energy of 14 TeV and the experiments collecting tens of inverse femtobarns of data each year. Boosted top quarks will be produced by the millions in the next phase of the LHC and a sizeable sample of top quarks with tera-electron-volt energies is expected.

Over the past few years, much work has been done to address the experimental challenges inherent in the new approach for boosted objects. Using the substructure of jets requires a precise understanding of how they are formed. Sophisticated new algorithms to identify boosted objects – W-taggers, top-taggers, Higgs-taggers – have been put forward and developed further by the LHC experiments.

The potential of these new methods to improve the sensitivity of LHC analyses has been estimated by using Monte Carlo simulations. One obvious area where tools tailored to boosted topologies might make a difference is in searches for signals of physics beyond the Standard Model in the most energetic collisions. Several such cases have been studied in detail. A significant pay-off in physics return is expected in resonance searches in the tt mass spectrum and studies of diboson production at high energy. Boosted techniques may also be applied to the high-energy tails of continuum production in the Standard Model. In what has become a seminal paper, the seemingly hopeless Higgs search in the WH, H → bb channel was resurrected by requiring that the Higgs boson is produced with moderate boost (Butterworth et al. 2008).

BOOST2012

By bringing together key theorists and experimentalists every year, a series of workshops known as BOOST offers a forum for discussion of the progress in this fast-moving field. The first of these, at SLAC (2009) and in Oxford (2010), focused on Monte Carlo studies that laid the foundations for what was to come. At Princeton in 2011, the first measurements on LHC data of jet substructure were shown, as well as candidate events for the world’s first boosted top quarks. The display of one of these was chosen as the logo for the latest workshop, BOOST2012, organized by the Instituto de Física Corpuscular (IFIC) in Valencia. Held near the Mediterranean in late July – soon after the historic announcement at CERN of the discovery of a Higgs-like boson – this latest workshop held the promise of becoming the “hottest” BOOST event so far. The 80 or so participants did not let the organizers down.

A lively debate arose in the session centred on attempts to predict the invariant mass of energetic jets, comparing them with the more sophisticated measurements that have become available this year. Experimentalists and theorists joined efforts to develop new techniques to deal with the impact of the 30 overlapping collisions that occur every time that the LHC bunches cross. The recent discovery at CERN fuelled the discussion on the potential of these techniques to help isolate a Higgs signal in the crucial bb decay channel. However, perhaps the most exciting results were presented in the session on applications of these new ideas to searches for new physics with top quarks at the LHC.

Speakers from the ATLAS and CMS collaborations reviewed their experiments’ searches for top-quark pair production through processes not present in the Standard Model. Some of these use the classical scheme to reconstruct top quarks, where the hadronic top-quark decay (t → Wb → b qq) is reconstructed by looking for three jets and then combining their four-vectors. Other searches adopt the “boosted” approach and reconstruct highly boosted top quarks as a single jet. While all searches have yielded negative results – reconstructed tt mass spectra following the Standard Model template to a frustrating precision – an evaluation of their relative sensitivity yields an encouraging conclusion. In both experiments, searches employing novel techniques specifically designed for boosted top-quark decay topologies are found to be considerably more sensitive than their classical counterparts in the high-mass region. This was expected from Monte Carlo studies, but these analyses show that the systematic uncertainties in the description of jet substructure, as well as the impact of pile-up on the experiments’ performance, are under control. In that sense, seeing these excellent results so early in the LHC programme constitutes a real proof of principle for this new approach.

The LHC produces – for the first time in the laboratory – large numbers of highly boosted heavy Standard Model particles. Results presented at BOOST2012 show that the development of new tools is on track to extract the maximum knowledge from the most energetic collisions. After careful commissioning and with conservative estimates of the uncertainties that affect this new approach, the first analyses employing boosted techniques to search for tt resonances clearly outperform their classical counterparts. These results are a milestone for the people in the field. The boosted paradigm is clearly ready to take on a major role in the LHC physics programme.

• The author would like to thank Gavin Salam for his useful comments in the preparation of this document

ESO and CERN – 50 years later

1948: The 5 m Hale telescope is inaugurated in Palomar, California. 1954: At the instigation of Jan Oort and Walter Baade, a group of renowned European astronomers meets to discuss how, by pooling the efforts of several countries, Europe could rise to the challenge and keep an important place in astronomical research; Jan Bannier, president of the CERN Council, is also present. A statement is adopted: “There is not a more urgent task for astronomers than to install powerful instruments in the southern hemisphere, and in particular a telescope … of at least 3 m.” But the scars of the Second World War are there and it will take several years of discussion before, on 5 October 1962, five governments (Belgium, France, the Federal Republic of Germany, the Netherlands and Sweden) sign the convention that creates the European Southern Observatory, ESO. The convention was drafted by Bannier, largely adapted from the CERN convention in its constitutional set-up, its financial basis and its personnel regulations (ESO and CERN: a tale of two organizations). Thus, in a sense, ESO is a younger sibling of CERN.

Soon, it was decided to establish the observatory at a site in Chile, in the Atacama Desert, chosen for its large proportion of clear nights and its excellent sky quality. A suitable piece of land was purchased at La Silla, close to La Serena. By 1969, a number of 1-m-class telescopes were in operation. Attention then focused on the construction of a 3.6 m telescope. The young organization had not yet mastered the skills necessary for such an endeavour and problems appeared on many fronts. CERN offered its help and soon the ESO Telescope Project Division moved to CERN. A participant in the preceding discussions, CERN’s Kees Zilverschoon reported that “practically everyone … emphasized the importance of the collaboration between astronomy and high-energy physics [and] common technical developments … and the political aspect: formation of a ‘Communauté scientifique européenne’ .” This was long before the discussions on a European area of research started at the political level. With the help of some CERN engineers, the 3.6 m telescope was completed by 1976. It is still in use today, in particular for the successful search for extra-solar planets with the HARPS spectrometer.

ESO was offered new headquarters in Garching by the German government, settling there in 1980. By then, it had an excellent set of experienced engineers and in 1989 deployed a revolutionary 3.5 m telescope, the New Technology Telescope (NTT). This introduced “active optics” in which the effects of gravity, winds and temperature on image quality are counteracted by controlling the shape of the primary mirror and the position of the secondary mirror.

Even before the first light of the NTT, ESO had begun the Very Large Telescope (VLT) project. It all started in December 1977 with a lively conference at CERN on “Optical Telescopes of the Future”. Detailed studies led to the selection of an array of four telescopes of 8.2 m aperture and with active optics, with the NTT serving as a prototype for the construction of the VLT. An impressive suite of first- and second-generation instruments, most of them developed in national laboratories, have been placed at the 11 available foci, while the 12th is reserved for visitor instruments. The second ESO observatory, on Mt Paranal – with its four large VLT telescopes, four 1.8 m telescopes dedicated to interferometry and two telescopes devoted to surveys of the sky in the optical and the infrared – is now the most productive observatory in the world, allowing major advances in virtually all fields of astrophysics.

It is in the same vicinity, on Mt Armazones, that ESO plans to erect its Extremely Large Telescope (ELT), based on a novel concept that features five mirrors in sequence instead of the usual two, with a segmented primary mirror 39 m in diameter. Corrections for blurring owing to turbulence in the atmosphere, which are today made with small deformable mirrors at the level of the instruments (“adaptive optics”), will – in the ELT – be made partially by two of the five mirrors of the telescope itself.

Following an agreement signed by ESO and the US National Science Foundation in 2003, which was soon joined by the National Astronomical Observatory of Japan in collaboration with Taiwan, ALMA, an ambitious millimetre and submillimetre observatory featuring 66 antennas has been under construction for the past few years on the Chajnantor plateau in the Atacama, at an altitude of 5000 m. The inauguration will take place next March but early science, with 16 telescopes, is already bringing highly exciting results (see, for example, ALMA tastes sugar around a Sun-like star).

In 2000, ESO fostered the creation of EIROforum, a partnership of seven European research organizations with the mission of combining the resources, facilities and expertise of its members to support European science in reaching its full potential. Chaired most recently by CERN in 2011–2012, it has just been joined by a new member, the European X-ray free-electron laser project, XFEL.

ESO and CERN share a range of scientific interests and have held stimulating joint conferences in the past, the last ones also involving ESA. Today, cosmology, dark matter, dark energy, high-energy gamma rays, neutrinos, gravitational waves, general relativity and processes in the vicinity of black holes are all hot topics for both communities and would deserve a new joint conference in the near future.

Time Machines

By Stanley Greenberg; Introduction by David C Cassidy
Hirmer Verlag
Hardback: €39.90 SwFr53.90 £39.95 $59.95

The American photographer Stanley Greenberg travelled 130,000 km over five years to create the 82 black-and-white photographs included in this large-format book. They are a record of the extraordinary and sometimes surreal complexity of the machinery of modern particle physics. From a working replica of an early cyclotron to the LHC, Greenberg covers the world’s major accelerators, laboratories and detectors. There are images from Gran Sasso, Super-Kamiokande, Jefferson Lab, DESY and CERN, as well as from Fermilab, SLAC, LIGO, the Sudbury Neutrino Observatory, IceCube at the South Pole and many more.

The LUNA experiment at Gran Sasso is like a giant steel retort-vessel suspended in the air; a LIDAR installation at the Pierre Auger Cosmic Ray Observatory in Argentina is a fantastically hatted creature from outer space bearing the warning “RADIACION LASER”; and the venerable 15-foot bubble chamber sits on the prairie at Fermilab like a massive space capsule that landed in the 1960s. (Who knows where its occupants might be now?)

Not a single person is seen in these beautiful images. They are clean, almost clinical studies of ingenious experiments and intricate machines and they document a world of pipes, concrete blocks, polished steel, electronics and braided ropes of wires. Greenberg has said that his earlier books, such as Invisible New York – which explores the city’s underbelly, its infrastructure, waterworks and hidden systems – are “about how cities and buildings work”, whereas Time Machines is about “how the universe works”. More accurately, perhaps, it is about the things that we build to help us understand how the universe works – but here the builders are invisible, like the particles that they are studying.

In a book whose photographs clearly demonstrate the global nature of particle physics, David Cassidy, author of an excellent biography of Werner Heisenberg, includes a one-sided introduction, concentrating on US labs and achievements. Accelerators are “prototypically American” and his main comment on the LHC is that the US has contributed half a billion dollars to it and that Americans form its “largest national group”. There are also inaccuracies: electroweak theory was confirmed by the discovery of the W and Z bosons at CERN in 1983, not 1973; and the top quark discovery was announced in 1995, not 2008. The introduction does not do justice to Greenberg’s excellent and wide-ranging photography but, fortunately, nor does it detract from it.

Pierre-Gilles de Gennes: A Life in Science

By Laurence Plévert
World Scientific
Paperback: £32
E-book: £41

Pierre-Gilles de Gennes received the Nobel Prize in Physics in 1991 “for discovering that methods developed for studying order phenomena in simple systems can be generalized to more complex forms of matter, in particular to liquid crystals and polymers”. Neither an invention nor a discovery – it is a curious citation. The committee seems to honour a man rather than any single identified contribution. Indeed, the life of de Gennes reads like an epic. He was born in 1932 into a family that combined banking and aristocracy. His parents separated and he was doubly pampered; when war broke out, it became an occasion for holidays in the Alps. This extraordinary childhood taught him discipline and curiosity, and gave him great self-confidence.

Drawn to science at the age of 15, he got through the difficult years of the classes préparatoires by playing in a jazz band. Admitted first in his year to the École normale supérieure, he began the free life of a normalien, marrying and becoming a father before sitting the agrégation. He developed a passion for quantum mechanics and group theory, which he dissected from books; Feynman was his model. Intuition must remain sovereign – a principle he also applied in politics, where he rejected the fashions of the day. The summer school at Les Houches, where he met Pauli, Peierls and others, came as a revelation: the two most important months of his life, he said. It confirmed his vocation for physics, but which path should he follow? Nuclear physics? “I have the impression that nobody knows how to describe an interaction except by adding parameters in an ad hoc way.”

On leaving the ENS, he joined the theory division at Saclay. After military service, he became a professor at Orsay at the age of 29. Given carte blanche, he tackled superconductivity, building a laboratory from nothing. Letting imagination be his guide, he mixed experiment and theory. He breathed enthusiasm into the young; his charisma worked on everyone.

He left Orsay in 1971, called to the prestigious Collège de France to create his own laboratory. There he developed the science of “soft matter”, embracing bubbles and sands, gels, polymers and more. A theorist of the practical, he advocated close collaboration with industry. Multidisciplinary before the word existed, he exploited the analogies suggested by his broad scientific culture.

In this faultless trajectory, one hesitation appears. “Midway upon the journey of his life”, he felt the challenge of age. He rose to it cheerfully, founding a second family while keeping good relations with the first, home to three grown-up children. His wife did not rebel. His private life was as fertile as his career, and three more children were born.

Then came the honours. He was elected to the Académie des Sciences, received the gold medal of the CNRS and the Légion d’honneur, and was offered a government ministry. While remaining at the Collège de France, he was called to direct the ESPCI, which he remodelled to his taste, earning himself a reputation there as a despot. He was a grand patron who embraced the role; indeed, his natural authority inspired a sacred awe in his collaborators. The apotheosis of the Nobel prize allowed him to apply his ideas with even less restraint. A great communicator, he popularized his ideas on television.

When cancer struck, he clung to his activities. Retired from the Collège de France, he continued his research at the Institut Curie, in the field of neuroscience. He died in 2007 after a hard-fought battle.

Pierre-Gilles de Gennes was a man of conviction. Sometimes criticized for the positions he took, he was not afraid to shake up habits by attacking sclerotic structures: “The university needs a revolution.” Another hobby-horse was “Big Science”: he opposed the Soleil synchrotron-radiation laboratory and the ITER project. A humanist, he published a delightful gallery of character portraits in the manner of La Bruyère, confessing: “I tend to believe that our mind has irrational needs as much as rational ones.”

Bubbling with ideas, the author of 550 publications and a man of influence who spoke his mind, he dared to say: “We must hasten the slow death of exhausted fields such as nuclear physics”, and remarked: “When I opened PRL in 1960, I found a revolutionary idea every time; today I come across two or three ideas a year, in a journal that has become five times thicker.” It is true that new ideas have become rare. We live on the legacy of past theoretical advances – the recently discovered Higgs boson was postulated 50 years ago. Hence the unsettling feeling that progress is becoming more laborious.

Pierre-Gilles de Gennes was a fertile and passionate mind, but he also lived in a favourable period, one that offered virgin territory in which research could flourish. A career like his seems impossible today, with specialities pushed to extremes that stifle individual initiative.

The biography, very well written by the journalist Laurence Plévert, is packed with anecdotes and reads like a novel, filling the reader with renewed optimism about the potential of the human adventure and of fundamental research.

“Renaissance man”, says the back cover; I would venture to compare Pierre-Gilles de Gennes to an enlightened monarch in the manner of a condottiere – which does not contradict the aphorism of a journalist who summed up the man’s appeal: “He is someone you would like to have as a friend, to share the privilege of feeling, for a moment, more intelligent.”

This book is a translation of the original French edition Pierre-Gilles de Gennes. Gentleman physicien (Belin, 2009).

Tevatron experiments observe evidence for Higgs-like particle

The CDF and DØ collaborations at Fermilab have found evidence for the production of a Higgs-like particle decaying into a pair of bottom and antibottom quarks, independent of the recently announced Higgs-search results from the LHC experiments. The result, accepted for publication in Physical Review Letters, will help in determining whether the new particle discovered at the LHC is the long-sought Higgs particle predicted in the Standard Model.

Fermilab’s Tevatron produced proton–antiproton collisions until its shutdown in 2011; the LHC produces proton–proton collisions. In their analyses, the teams at both colliders search for all potential Higgs decay modes to ensure that no Higgs-boson event is missed. While the Standard Model does not predict the mass of the Higgs boson, it does predict that a Higgs boson with a mass below 135 GeV decays preferentially into a pair of b quarks, while a heavier Higgs would decay most often into a pair of W bosons.

The CDF and DØ teams have analysed the full Tevatron data set – accumulated over the past 10 years. Both collaborations developed substantially improved signal and background separation methods to optimize their search for the Higgs boson, with hundreds of scientists from 26 countries actively engaged in the search.

After careful analysis and multiple verifications, on 2 July CDF and DØ announced a substantial excess of events in the data beyond the background expectation in the mass region between 120 GeV and 135 GeV, which is consistent with the predicted signal from a Standard Model Higgs boson. Two days later, the ATLAS and CMS collaborations announced the observation in collisions at the LHC of a new boson with a mass of about 125 GeV.

At both the Tevatron and the LHC, b jets are produced in huge numbers, drowning out the signal expected when a Standard Model Higgs boson decays to two b quarks. At the Tevatron, the most successful way to search for a Higgs boson in this final state is to look for those produced in association with a W or Z boson. The small signal and large background require that the analysis includes every event that is a candidate for a Higgs produced with a W or Z boson. Furthermore, the analysis must separate the events that are most signal-like from the rest.

In the past two years, the CDF and DØ Higgs-search analysis teams improved the expected Higgs sensitivity of these experiments by almost a factor of two by separating the analysis into multiple search channels, enlarging the acceptance for final decay products and developing innovative particle-identification methods. Combined with a Tevatron data set of 10 fb⁻¹, these efforts led to the extraction of about 20 Higgs-like events that are not compatible with background-only predictions. These events are consistent with the production and decay of Higgs bosons at the Tevatron. The signal has a statistical significance of 3.1 σ.
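For a counting experiment of this kind, the expected significance of a signal s on an expected background b is commonly estimated with the Asimov approximation to the profile likelihood. The sketch below illustrates the formula only; the numbers are purely illustrative and are not the actual CDF/DØ inputs:

```python
import math

def asimov_significance(s, b):
    """Median expected discovery significance for s signal events on
    top of an expected background b (Asimov approximation); reduces
    to the familiar s / sqrt(b) in the limit s << b."""
    return math.sqrt(2.0 * ((s + b) * math.log(1.0 + s / b) - s))

# Illustrative only: a ~20-event signal needs a sufficiently small
# and well-understood background to reach the 3-sigma level.
z = asimov_significance(20.0, 30.0)
```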

A day to remember

On 4 July, particle physicists around the world eagerly joined many who had congregated early at CERN to hear the latest news on the search for the Higgs boson at the LHC (4 July: a day to remember). It was a day that many will remember for years to come. The ATLAS and CMS collaborations announced that they had observed clear signs of a new boson consistent with being the Higgs boson, with a mass of around 126 GeV, at a significance of 5 σ. In this issue of CERN Courier the two collaborations present their evidence (Discovery of a new boson – the ATLAS perspective and Inside story: the search in CMS for the Higgs boson) and CERN’s director-general reflects on broader implications (Viewpoint: an important day for science). There was further good news from Fermilab with new results on the search for the Higgs at the Tevatron, described above.

Proton run for 2012 extended by seven weeks

An important piece of news that was almost lost in the excitement of the Higgs update seminar on 4 July is that the LHC proton run for 2012 is to be extended. On 3 July, a meeting between CERN management and representatives from the LHC and the experiments discussed the merits of increasing the data target for this year in the light of the announcement to be made the following day (4 July: a day to remember). The conclusion was that an additional seven weeks of running would allow the luminosity goal for the year to be increased from 15 to 20 fb⁻¹. This should give the experiments a good supply of data to work on during the LHC’s first long shutdown as well as allow them to make progress in determining the properties of the new particle.

The original schedule foresaw proton running ending on 16 October, with a proton–ion run planned for November. In the preliminary new schedule, proton running is planned to continue until 16 December, with the proton–ion run starting after the Christmas stop on 18 January 2013 and continuing until 10 February.

Auger determines pp inelastic cross-section at √s = 57 TeV

Ultra-high-energy cosmic-ray particles constantly bombard the atmosphere at energies far beyond the reach of the LHC. The Pierre Auger Observatory was constructed with the aim of understanding the nature and characteristics of these particles using precise measurements of cosmic-ray-induced extensive air showers up to the highest energies. These studies allow Auger to measure basic particle interactions, recently in an energy range equivalent to a centre-of-mass energy of √s = 57 TeV.

The structure of an air shower is complex and depends in a critical way on the features of hadronic interactions. Detailed observations of air showers in combination with astrophysical interpretations can provide specific information about particle physics up to √s = 500 TeV. This corresponds to an energy of 10^20 eV for a primary proton in the laboratory system.

The depth in the atmosphere at which a cosmic-ray air shower reaches its maximum size, Xmax, correlates with the atmospheric depth at which the primary cosmic-ray particle interacted. The distribution of the measured Xmax values for the most deeply penetrating showers exhibits an exponential tail, the slope of which can be directly related to the interaction length of the initiating particle. This, in turn, provides the inelastic proton–air cross-section. The proton–proton (pp) cross-section is then inferred using an extended Glauber calculation with parameters derived from accelerator measurements that have been extrapolated to cosmic-ray energies. This Auger analysis is an extension of a method first used in the Fly’s Eye experiment in Utah (Baltrusaitis et al. 1984).
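The last step of that chain – from the slope of the exponential Xmax tail to a cross-section – can be sketched as below. This is only indicative: in the real Auger analysis the measured slope Λ also absorbs shower-to-shower fluctuations, so the naive identification of Λ with the interaction length used here, and the constants chosen, are illustrative assumptions:

```python
U_GRAMS = 1.660539e-24   # atomic mass unit in grams
A_AIR = 14.45            # effective mean mass number of air

def sigma_p_air_mb(slope_g_cm2):
    """Convert an exponential Xmax-tail slope (in g/cm^2), naively
    taken as the proton-air interaction length, into a cross-section
    in millibarn via sigma = A * u / lambda."""
    sigma_cm2 = A_AIR * U_GRAMS / slope_g_cm2
    return sigma_cm2 * 1.0e27   # 1 mb = 1e-27 cm^2

# A slope of ~55 g/cm^2 corresponds to a few hundred millibarn,
# the right order of magnitude for proton-air at these energies.
sigma = sigma_p_air_mb(55.0)
```

A shallower slope (longer interaction length) maps to a smaller cross-section, which is why the most deeply penetrating showers carry the information.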

The composition of the highest-energy cosmic rays – whether they are protons or heavier nuclei – is not known and the purpose of the Auger analysis is to help in understanding it. The analysis targets the most deeply penetrating particles and so is rather insensitive to the nuclear mix. As long as there are at least some primary protons, then it is their cross-section that is measured. Moreover, to minimize systematic uncertainties, the Pierre Auger collaboration has chosen the cosmic-ray energy range of 10^18–10^18.4 eV (√s ≈ 57 TeV), in which protons appear to constitute a significant contribution to the overall flux. The largest uncertainty arises from a possible helium contamination, which would tend to yield too large a proton inelastic cross-section.

The figure shows the experimental result, which is to be published in Physical Review Letters (Abreu et al. 2012). It confirms the cross-section extrapolations implemented in interaction models that predict a moderate growth of the cross-section beyond LHC energies and is in agreement with the ln²(s) rise of the cross-section expected from the Froissart bound.

Heavy-ion jets go with the flow

The studies of central heavy-ion collisions at the LHC by the ALICE, ATLAS and CMS experiments show that partons traversing the produced hot and dense medium lose a significant fraction of their energy. At the same time, the structure of the jet from the quenched remnant parton is essentially unmodified. The radiated energy reappears mainly at low and intermediate transverse momentum, pT, and at large angles with respect to the centre of the jet cone. The ALICE collaboration has studied this pT region in PbPb collisions at a centre-of-mass energy √sNN = 2.76 TeV by using two-particle angular correlations, with some interesting results.

Figure 1 (image: CCnew5_07_12)

In the analysis, the associated particles are counted as a function of their difference in azimuth (Δφ) and pseudorapidity (Δη) with respect to a trigger particle in bins of trigger transverse momentum, pT,trig, and associated transverse momentum, pT,assoc. With the aim of studying potential modifications of the near-side peak, correlations independent of Δη are subtracted by an η-gap method: the correlation found in 1 < |Δη| < 1.6 (as a function of Δφ) is subtracted from the region in |Δη| < 1. Figure 1 shows an example in one pT bin: only the near-side peak remains, while by construction the away-side (not shown) is flat.

ALICE studies the shape of the near-side peak by extracting both its rms value (which is a standard deviation, σ, for a distribution centred at zero) in the Δη and Δφ directions and the excess kurtosis (a statistical measure of the “peakedness” of a distribution). The near-side peak shows an interesting evolution towards central collisions: it becomes eccentric.
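
Both shape observables are plain sample moments of the background-subtracted peak projected onto one axis (Δη or Δφ). A minimal sketch, using a Gaussian toy sample (for which the excess kurtosis is zero by definition):

```python
import numpy as np

def peak_rms(x):
    """Standard deviation of the projection; equals the rms for a
    distribution centred at zero, as for the near-side peak."""
    return np.sqrt(np.mean((x - np.mean(x)) ** 2))

def excess_kurtosis(x):
    """Fourth standardised moment minus 3: positive for a distribution more
    sharply peaked than a Gaussian, negative for a flatter one."""
    m = np.mean(x)
    s2 = np.mean((x - m) ** 2)
    return np.mean((x - m) ** 4) / s2**2 - 3.0

# Gaussian toy projection with sigma = 0.5.
rng = np.random.default_rng(3)
gauss = rng.normal(0.0, 0.5, 100_000)
rms_val = peak_rms(gauss)
kurt_val = excess_kurtosis(gauss)
```

In the data, the growth of the Δη rms relative to the Δφ rms towards central collisions is what makes the peak "eccentric".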

Figure 2 (image: CCnew6_07_12)

Figure 2 presents the rms as a function of centrality in PbPb collisions, as well as the value for pp collisions (plotted at a centrality of 100%). Towards central collisions the σ in Δη (lines) increases significantly, while the σ in Δφ (data points) remains constant within uncertainties. This is found for all of the pT bins studied, from 1 < pT,assoc < 2 GeV/c, 2 < pT,trig < 3 GeV/c to 2 < pT,assoc < 3 GeV/c, 4 < pT,trig < 8 GeV/c (Grosse-Oetringhaus 2012).

The observed behaviour is qualitatively consistent with a picture where longitudinal flow distorts the jet shape in the η-direction (Armesto et al. 2004). The extracted rms and also the kurtosis (not shown here) are quantitatively consistent (within 20%) with Monte Carlo simulations with A MultiPhase Transport Code (AMPT) (Lin et al. 2005). This Monte Carlo correctly reproduces collective effects such as “flow” at the LHC, which stem from parton–parton and hadron–hadron rescattering in the model.

This observation suggests an interplay of the jet with the flowing bulk in central heavy-ion collisions at the LHC. The further study of the low and intermediate pT region is a promising field for the understanding of jet quenching at the LHC, which in turn is a valuable probe of the fundamental properties of quark–gluon plasma.

XENON100 sets record limits

The XENON collaboration has announced the results of the analysis of data taken with the XENON100 detector during 13 months of operation at INFN’s Gran Sasso National Laboratory. The analysis provides no evidence for the existence of weakly interacting massive particles (WIMPs), the leading candidates for dark matter. The two events observed are statistically consistent with the one event expected from background radiation. Compared with the collaboration’s previous result from 2011, the sensitivity has been improved by a further factor of 3.5. This constrains models of new physics with WIMP candidates even further and helps to target future WIMP searches.

XENON100 is an ultrasensitive device. It uses 62 kg of ultrapure liquid xenon as a WIMP target and simultaneously measures ionization and scintillation signals that are expected from rare collisions between WIMPs and the nuclei of xenon atoms. The detector is operated deep underground at the Gran Sasso National Laboratory, to shield it from cosmic rays. To avoid false events occurring from residual radiation from the detector’s surroundings, only data from the inner 34 kg of liquid xenon are taken as candidate events. In addition, the detector is shielded by specially designed layers of copper, polyethylene, lead and water to reduce the background noise even further.

In 2011, the XENON100 collaboration published results from 100 days of data-taking. The sensitivity achieved already pushed the limits on WIMPs by a factor of 5 to 10 compared with results from the earlier XENON10 experiment. During the new run, a total of 225 live days of data was accumulated in 2011 and 2012, with lower background and hence improved sensitivity. Again, no signal was found.

The two events observed are statistically consistent with the expected background of one event. The new data improve the upper bound on the elastic WIMP–nucleon cross-section to 2.0 × 10⁻⁴⁵ cm² at a WIMP mass of 50 GeV/c². This is a further factor of 3.5 compared with the earlier results and cuts significantly into the expected WIMP parameter region. Measurements are continuing with XENON100, and a still more sensitive tonne-scale successor, XENON1T, is currently under construction.
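
The logic of such a null result can be illustrated with a simple Poisson counting toy (the published analysis uses a full profile-likelihood treatment, so this is only a sketch with illustrative numbers): with n observed events and an expected background b, the 90% CL upper limit on the signal mean s solves P(N ≤ n | s + b) = 0.10.

```python
import math

def poisson_cdf(n, mu):
    """P(N <= n) for a Poisson variable with mean mu."""
    return sum(math.exp(-mu) * mu**k / math.factorial(k) for k in range(n + 1))

def signal_upper_limit(n_obs, b, cl=0.90, tol=1e-6):
    """Bisect for the signal mean s at which observing <= n_obs events
    becomes improbable at the requested confidence level."""
    lo, hi = 0.0, 50.0
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if poisson_cdf(n_obs, mid + b) > 1.0 - cl:
            lo = mid  # outcome still too probable: the limit lies higher
        else:
            hi = mid
    return 0.5 * (lo + hi)

# XENON100-like counts: 2 observed events, ~1 expected background event.
s_up = signal_upper_limit(2, 1.0)
```

The resulting limit on the expected number of signal events is converted into a cross-section limit by dividing by the detector exposure and efficiency, which is where the quoted 2.0 × 10⁻⁴⁵ cm² figure comes from in the full analysis.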

The XENON collaboration consists of scientists from 15 institutions in China, France, Germany, Israel, Italy, the Netherlands, Portugal, Switzerland and the US.
