LHC sets world record beam intensity

Just before midnight on 21 April, the LHC set a new world record for beam intensity at a hadron collider when its beams collided with a peak luminosity of 4.67 × 10³² cm⁻²s⁻¹. This exceeds the previous world record of 4.024 × 10³² cm⁻²s⁻¹, which was set by Fermilab’s Tevatron collider in 2010, and marks an important milestone in LHC commissioning. The new record, made with 480 bunches per beam, lasted only a couple of days before collisions with 768 bunches per beam delivered around 8.4 × 10³² cm⁻²s⁻¹. By the time a period of machine development began in the first week of May, the integrated luminosity for ATLAS and CMS for 2011 had reached more than 250 pb⁻¹.
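As a rough guide to what such numbers mean in practice, the sketch below converts a peak luminosity into an integrated luminosity, assuming (unrealistically) that the peak value is held constant; it is a simple unit-conversion illustration, not an official LHC calculation.

```python
# Back-of-the-envelope conversion from peak (instantaneous) luminosity to
# integrated luminosity, assuming the peak value is sustained without decay.
# Illustrative only; real fills decay and are interrupted.

PB_INV_IN_CM2 = 1e36               # 1 pb^-1 corresponds to 10^36 cm^-2

def integrated_lumi_pb(peak_lumi_cm2_s, hours):
    """Integrated luminosity in pb^-1 for a constant-luminosity run."""
    return peak_lumi_cm2_s * hours * 3600.0 / PB_INV_IN_CM2

# Record fill of 21 April 2011: 4.67e32 cm^-2 s^-1
print(integrated_lumi_pb(4.67e32, 24))   # ~40 pb^-1 per idealized day
```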

Early in April, a period of “scrubbing” took place to improve the surface characteristics of the beam pipe. This run saw more than 1000 high-intensity bunches per beam circulating at 450 GeV with 50 ns spacing. Given the potential luminosity performance (more bunches, higher bunch intensity from the injectors), the decision was taken to continue the 2011 physics run with this bunch spacing.

For 50 ns injection into the LHC, the Super Proton Synchrotron (SPS) takes batches of 36 bunches from the Proton Synchrotron. Since the scrubbing run, the LHC has passed through 228, 336, 480 and 624 bunches per beam to reach the latest total of 768. Each step-up of 144 bunches represents two extra injections of 72 bunches (2 × 36) from the SPS. This is a considerable amount of beam power, and the injection process needs to be carefully tuned and monitored. A few days are spent delivering physics after each step-up to check the performance of the machine and make sure that no intensity-dependent effects are compromising machine protection.

The push-up in the number of bunches will continue towards a potential maximum for the year of around 1400.

AMS takes off

The space shuttle Endeavour launched successfully on its 25th and final spaceflight on 16 May at 08:56 a.m. local time. It carried the Alpha Magnetic Spectrometer (AMS-02), designed to operate as an external module on the International Space Station (ISS). Endeavour was scheduled to dock with the ISS nearly 48 hours later, on 18 May, for a 16-day mission. A problem with an auxiliary power unit had led to the last-minute postponement of the earlier planned launch on 29 April.

AMS will study the universe and its origin by searching for antimatter and dark matter while performing precision measurements of the composition and flux of cosmic rays. There will be more about the mission in the next issue of CERN Courier.

LHCf measures very forward photons

The LHCf collaboration has measured the production spectrum of photons using the highest-energy accelerator beams in the world, at CERN’s LHC. With proton beams at 3.5 TeV, the total collision energy is equivalent to that of a proton of 2.5 × 10¹⁶ eV striking a stationary target – an energy region of interest to cosmic-ray physicists.
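The equivalence quoted above follows from the invariant mass of the collision: for a beam proton hitting a proton at rest, s ≈ 2·m_p·E_lab, so E_lab ≈ s/(2m_p). A minimal check of the arithmetic (not LHCf code):

```python
# Fixed-target equivalent of a 3.5 TeV + 3.5 TeV proton-proton collision.
# For a beam proton hitting a proton at rest (E_lab >> m_p), s ~ 2*m_p*E_lab,
# hence E_lab ~ s / (2*m_p).  Values in GeV.

m_p = 0.938                 # proton mass [GeV]
sqrt_s = 7000.0             # centre-of-mass energy [GeV]

E_lab = sqrt_s**2 / (2.0 * m_p)
print(f"{E_lab:.2e} GeV")   # ~2.6e+07 GeV, i.e. ~2.6e16 eV, consistent with ~2.5e16 eV
```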

The LHCf experiment consists of two independent calorimeters installed on either side of the ATLAS interaction point at the LHC. Using data obtained in 2010 during proton runs at 7 TeV in the centre-of-mass, the collaboration has measured the photons emitted into two very forward regions – that is, close to zero degrees to the beam direction – in the pseudo-rapidity ranges from 8.81 to 8.99 and from 10.94 to infinity (Adriani et al. 2011). To minimize contamination from beam-gas background and pile-up events, the team chose a limited but high-quality dataset corresponding to an integrated luminosity of 0.68 nb⁻¹. After selecting single-photon-like events in common pseudo-rapidity ranges, they obtained consistent energy spectra from the two detectors.
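To see just how “forward” these regions are, the pseudo-rapidity can be translated back into a polar angle with θ = 2 arctan(e⁻ᵞ) for η; the snippet below is a simple illustration of that relation, not part of the LHCf analysis.

```python
import math

# Pseudo-rapidity is eta = -ln(tan(theta/2)), so theta = 2*atan(exp(-eta)).
# This shows how close to the beam axis the quoted acceptance lies.

def theta_deg(eta):
    return math.degrees(2.0 * math.atan(math.exp(-eta)))

for eta in (8.81, 8.99, 10.94):
    print(f"eta = {eta:5.2f}  ->  theta = {theta_deg(eta):.4f} deg")
# eta = 8.81 corresponds to ~0.017 deg; eta = 10.94 to ~0.002 deg
```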

The collaboration has compared its data with the predictions of various hadron-interaction models used in the study of cosmic-ray air showers, together with PYTHIA 8.145, which is popular in the high-energy-physics community. As the figure shows, there is a significant deviation between the data and the models above 2 TeV in the higher-rapidity region. Three well known models – DPMJET 3.04, QGSJET II-03 and PYTHIA 8.145 – predict significantly higher photon yields than the experiment finds above 2 TeV, but agree reasonably well with the data at 0.5–1.5 TeV. The other models – SIBYLL 2.1 and EPOS 1.99 – do not show such an excess at high energy, but instead predict a smaller yield than observed over the whole energy range. The difference is less marked in the lower-rapidity region, but nevertheless none of the models shows perfect agreement with the data.

The energy spectra of collision products at high rapidities are crucial for a correct understanding of the development of cosmic-ray-induced air showers. Following recent notable improvements in observations of ultra-high-energy cosmic rays (UHECRs), it is becoming increasingly important to reduce the uncertainty in the hadron-interaction models. The impact of the current LHCf results on cosmic-ray physics is now under study as the collaboration works together with theorists on further analyses of the data on neutral pions and neutrons. The data will also cast light on the energy dependence of hadron interactions and the extrapolation into the UHECR energy range. At the same time, the collaboration is studying the feasibility of data-taking during ion collisions (ion–ion and/or proton–ion), which would give a better simulation of collisions between cosmic rays and air nuclei.

CMS measures single-top production at 7 TeV

The top quark was first observed in the mid-1990s by the CDF and DØ experiments at the Tevatron collider at Fermilab. Those first top quarks were produced and observed as top–antitop pairs, and it was not until 2009 that the two experiments reported the observation of single top quarks. The ATLAS and CMS experiments at the LHC reported the first signs of top–antitop production last summer, just a few months after the first collisions at a centre-of-mass energy of 7 TeV. Now, CMS has completed two complementary single-top analyses using the full data sample of 2010, corresponding to an integrated luminosity of 36 pb⁻¹.

Single top quarks are much more difficult to observe experimentally because they are produced at a lower rate and have a less distinctive signature than top–antitop pairs, which makes single-top events harder to distinguish from the background physics processes.

In their recent analyses, the CMS collaboration focused on the production of single top via the so-called “t-channel W boson exchange” process in which the top quark emerges from the exchanged W together with a light quark. They observed the top quark through its decay into a W boson and a b-quark. The W boson was detected in turn through its decay to a charged lepton (electron or muon) plus a neutrino, while the jet from the b-quark was tagged by the high-precision silicon tracking detectors in CMS.

The two analyses establish the observation of single-top production by CMS with a statistical significance of about 3.5σ. One analysis exploited the angular characteristics between the light-quark jet and the final-state lepton, shown in the figure, while the other used a multivariate analysis technique to separate the signal from the background. Data-driven background estimates were used in both analyses. The two analysis methods were combined to yield a cross-section for single-top production in proton–proton collisions at 7 TeV of 83.6 ± 29.8 (stat.+syst.) ± 3.3 (lumi.) pb. This result agrees well with the rate predicted by the Standard Model.

Such a rapid detection of the elusive single top, despite the challenging background conditions, shows that the experiments are well prepared to detect and measure signals of new physics. These may soon manifest themselves as the LHC continues to produce ever more data at the high-energy frontier.

ATLAS explores new frontiers with high-pT jet measurements

The ATLAS collaboration has announced its latest cross-section measurements of inclusive jet and dijet production, which involve final states containing at least one or two jets, respectively. Each jet is the result of a parton (quark or gluon) that emits radiation through the strong force, creating a collimated spray of hadrons.

These high-pT jet measurements confront QCD, the theory of the strong force, in a large and previously unexplored kinematic region in jet transverse-momentum and dijet invariant-mass. The measurements constitute one of the most stringent tests of QCD ever performed. They probe predictions of perturbative QCD, constrain the density of partons within the proton and are sensitive to new physics scenarios, such as quark compositeness, which may become apparent at very short distance scales.

The analysis uses the full data sample collected in LHC proton–proton collisions at 7 TeV during 2010, corresponding to an integrated luminosity of 37 pb⁻¹. The results extend far beyond the kinematic reach achieved at the Tevatron, as do recent results from CMS (CMS collaboration 2011). The ATLAS results extend to 1.5 TeV in jet transverse momentum (figure 1) and to 4.1 TeV in dijet invariant mass. These jet measurements also provide unprecedented coverage out to forward rapidities of |y| < 4.4. Next-to-leading-order perturbative QCD predictions are found to be in good agreement with the measured data across 10 orders of magnitude in cross-section (figure 2).
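For readers unfamiliar with the dijet variable, the invariant mass of a pair of (approximately massless) jets follows from their transverse momenta, rapidities and azimuthal angles. The sketch below illustrates the standard kinematic relation; it is not ATLAS analysis code.

```python
import math

# Dijet invariant mass for two approximately massless jets:
#   m_jj^2 = 2 * pT1 * pT2 * (cosh(y1 - y2) - cos(phi1 - phi2))

def dijet_mass(pt1, y1, phi1, pt2, y2, phi2):
    return math.sqrt(2.0 * pt1 * pt2 * (math.cosh(y1 - y2) - math.cos(phi1 - phi2)))

# Two back-to-back 1 TeV jets separated by three units of rapidity
# already give a multi-TeV invariant mass (values in GeV):
print(dijet_mass(1000.0, 1.5, 0.0, 1000.0, -1.5, math.pi))   # ~4700 GeV
```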

The jet cross-section measurements have been corrected for detector effects, and the analysis exploits a greatly improved understanding of the detector performance. The dominant source of systematic uncertainty is in the calibration of the jet energy scale, which has been determined to within 2.5% for central jets with pT above 60 GeV.

A publication is currently in preparation. Work is ongoing to reduce the systematic uncertainties further, and the collaboration will extend the kinematic reach of these exciting high-pT jet measurements with much larger datasets in 2011–2012.

RHIC reveals heaviest antimatter

Members of the international STAR collaboration at the Relativistic Heavy Ion Collider (RHIC) at Brookhaven National Laboratory have observed antihelium-4. This is the heaviest antinucleus detected so far, following the discovery of the first antihypernucleus (an antiproton, an antineutron and an anti-Λ) by the same collaboration just a year ago. After sifting through 0.5 × 10¹² tracks in data from 10⁹ gold–gold collisions at centre-of-mass energies of 200 GeV and 62 GeV per nucleon–nucleon pair, the STAR collaboration found 18 events with the signature of the antihelium-4 nucleus, which is distinguished by its mass together with its charge of −2.

While the curvature of the tracks in the magnetic field of the STAR detector provides a momentum measurement, key information also comes from the mean energy loss per unit track length, 〈dE/dx〉, in the gas of the TPC and from the time of flight of particles arriving at the time-of-flight barrel that surrounds the TPC. The 〈dE/dx〉 information helps in identification by distinguishing particles with different masses or charges, while the time of flight is needed for identification at higher momenta, above 1.75 GeV/c. The figure shows the identification of isotopes based on energy loss and on the mass calculated from momentum and time of flight, in the region of helium-3 and helium-4 for both positive and negative particles, with 18 counts for antihelium-4.
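The combination of momentum and time of flight determines the mass through m·c² = p·c·√((ct/L)² − 1). The snippet below illustrates that generic relation with invented numbers; it is not the STAR reconstruction code, which also has to fold in the charge of the track.

```python
import math

# Particle mass from momentum, path length and time of flight:
#   m = p * sqrt( (c*t/L)^2 - 1 )
# Generic relation with made-up inputs, not STAR data or code.

C = 0.299792458                      # speed of light [m/ns]

def mass_gev(p_gev, path_m, tof_ns):
    """Mass in GeV/c^2 from momentum [GeV/c], path [m] and time of flight [ns]."""
    inv_beta = C * tof_ns / path_m   # 1/beta
    return p_gev * math.sqrt(inv_beta**2 - 1.0)

# Hypothetical track: p = 3.0 GeV/c over 2.2 m in 7.69 ns
print(mass_gev(3.0, 2.2, 7.69))      # ~0.94 GeV/c^2, i.e. proton-like
```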

The team used this observation to calculate the antimatter yield at RHIC and found that the production rate falls by a factor of 1.6 (+1.0/−0.6) × 10³ for each additional antinucleon, and by 1.1 (+0.3/−0.2) × 10³ for each additional nucleon. This is in line with the expectations from coalescent nucleosynthesis models, as well as from thermodynamic models.

The finding ties in with the scientific goals of the Alpha Magnetic Spectrometer launched on 16 May (AMS takes off), which will search for antimatter in space. It also nicely marks the centenary of the paper by Ernest Rutherford in which he analysed the scattering of helium nuclei (alpha particles) on gold and first established the existence of the atomic nucleus.

Italy approves long-term funding for the SuperB project

The Italian government has approved the long-term funding of the SuperB project. Mariastella Gelmini, the Italian minister for university education and research, announced on 19 April that the Interministerial Committee for Economic Programming had approved the National Research Plan 2011–2013. This sets out the future direction of 14 flagship projects, including SuperB.

The SuperB project is based on the principle that smaller particle accelerators, operating at low energy, can still give excellent scientific results complementary to the high-energy frontier. The project centres on an asymmetric electron–positron collider with a peak luminosity of 10³⁶ cm⁻²s⁻¹. Such a high luminosity will allow the indirect exploration of new effects in the physics of heavy quarks and flavours at energy scales up to 10–100 TeV, through studies of large samples of B, D and τ decays at a centre-of-mass energy of only 10 GeV. At full power, SuperB should be able to produce 1000 pairs of B mesons and the same number of τ pairs, as well as several thousand D mesons, every second. The design is based on ideas developed in Italy and tested by the accelerator division of the National Laboratories of INFN in Frascati using the DAΦNE machine.
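The quoted production rates follow directly from rate = luminosity × cross-section. As a rough illustration (the B-pair cross-section at the Υ(4S), about 1.1 nb, is an assumed approximate value used for scale, not a figure taken from the SuperB plan):

```python
# Event rate = luminosity x cross-section.
# The ~1.1 nb e+e- -> Y(4S) -> BBbar cross-section is an assumed, approximate
# number used only to illustrate the scale of the rates quoted in the text.

NB_IN_CM2 = 1e-33                 # 1 nb = 10^-33 cm^2

luminosity = 1e36                 # cm^-2 s^-1
sigma_bb_nb = 1.1                 # assumed B-pair cross-section [nb]

rate = luminosity * sigma_bb_nb * NB_IN_CM2
print(f"{rate:.0f} B-meson pairs per second")   # ~1100 per second
```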

Sponsored by the National Institute of Nuclear Physics (INFN), SuperB is to be built in Italy with international involvement. Many countries have expressed an interest in the project, and physicists from Canada, France, Germany, Israel, Norway, Poland, Russia, Spain, the UK and the US are taking part in the design effort.

The Istituto Italiano di Tecnologia is co-operating with INFN on the project, which should help in the development of innovative techniques with an important impact on technology and other research areas. It will be possible, for example, to use the accelerator as a high-brilliance light source. The machine will be equipped with several photon channels, allowing the extension of the scientific programme to the physics of matter and biotechnology.

Gravity Probe B confirms Einstein’s general relativity

The final results of the Gravity Probe B (GP-B) satellite confirm two key predictions of Albert Einstein’s general theory of relativity: the geodetic and frame-dragging effects. The precise determination of these two effects, first proposed 50 years ago, seals the success of this extremely challenging mission.

The history of Gravity Probe B started in 1961 with a proposal to NASA to develop a relativity gyroscope experiment. A refined “Proposal to develop a zero-G, drag-free satellite and to perform a gyro test of general relativity in a satellite” was submitted in November 1962 and funded one year later. Defining the mission was relatively simple but solving all of the technological challenges to obtain the desired precision became an odyssey.

The basic idea is to place gyroscopes and a telescope in a polar-orbiting satellite; to align both the telescope and the spin axis of each gyroscope with a distant reference point, a guide star; and to keep pointing at this star for a year while measuring the drift in the spin-axis alignment of each gyroscope. The problem is that this has to be achieved with an accuracy of 1 milliarcsecond (mas). In practice, the gyroscopes are four perfect spheres the size of a ping-pong ball with a spin rate of around 70 Hz, which are made of quartz coated with niobium. They have to be kept without any contact inside a quartz housing with an inner radius only 32 μm larger than the balls. One of the gyroscopes is even left free-floating and the entire spacecraft is moved round to keep the device in its housing. Everything has to be completely isolated magnetically and cooled to 1.8 K to achieve superconductivity in the niobium coating that is used to measure the spin axis of the gyroscopes.

On 4 May, the GP-B team proudly announced: “After 31 years of research and development, 10 years of flight preparation, a 1.5-year flight mission and 5 years of data analysis, our GP-B team has arrived at the final experimental results for this landmark test of Einstein’s 1916 general theory of relativity.” The results are values for two measurements: a geodetic drift rate of −6601.8 ± 18.3 mas/yr and a frame-dragging drift rate of −37.2 ± 7.2 mas/yr. Both effects are clearly detected and the values are consistent with the predictions of general relativity of −6606.1 mas/yr and −39.2 mas/yr, respectively. The final results are an error-weighted average of the drifts of the spin axes measured on the four individual gyroscopes. The geodetic effect – the deformation of space–time around the Earth – leads to a drift in the north–south direction, while the frame-dragging or Lense–Thirring effect – the entrainment of space–time by the daily rotation of the Earth – results in a west–east drift.
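The “error-weighted average” mentioned here is the standard inverse-variance combination of independent measurements. The sketch below shows the recipe with invented per-gyroscope numbers; it does not reproduce the individual GP-B gyroscope results.

```python
# Inverse-variance (error-weighted) average of independent measurements,
# the standard way to combine results such as four gyroscope drift rates.
# The per-gyroscope values below are invented placeholders, not GP-B data.

def weighted_average(values, errors):
    weights = [1.0 / e**2 for e in errors]
    mean = sum(w * v for w, v in zip(weights, values)) / sum(weights)
    err = (1.0 / sum(weights)) ** 0.5
    return mean, err

values = [-6588.0, -6610.0, -6605.0, -6602.0]   # mas/yr (placeholders)
errors = [34.0, 25.0, 28.0, 37.0]               # mas/yr (placeholders)
print(weighted_average(values, errors))
```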

The eventual publication of the results, from data acquired between 28 August 2004 and 14 August 2005, must be a relief for Francis Everitt, the principal investigator of GP-B, and his team at Stanford University. Extracting the signal from the noise and accounting for all of the systematic effects was a difficult process. The final uncertainty on the frame-dragging effect remains relatively high (19%), far from the prelaunch goal of an accuracy of 1% (CERN Courier June 2004 p13). It does not supersede the accuracy of the measurements made with the LAser GEOdynamics Satellites (LAGEOS), which were published shortly after the launch of GP-B, although the uncertainty on those measurements remains controversial (CERN Courier December 2004 p15). Perhaps the success of the GP-B mission resides more in the extraordinary technological achievements than in the actual results.

Hardware joins the open movement

“Designing in an open environment is definitely more fun than doing it in isolation, and we firmly believe that having fun results in better hardware.” It is hard to deny that enthusiasm is inspiring and that it can be one of the factors in the success of any enterprise. The statement comes from the Manifesto of the Open Hardware Repository (OHR), which is defined by its creators as a place on the web where electronics designers can collaborate on open-hardware designs, much in the philosophy of the movement for open-source software. Of course, there is more to this than the importance of enthusiasm. Feedback from peers, design reuse and better collaboration with industry are also among the important advantages to working in an open environment.

The OHR was the initiative of electronics designers working in experimental-physics laboratories who felt the need to enable knowledge exchange across a wide community, in line with the ideals of “open science” being fostered by organizations such as CERN. “For us, the drive towards open hardware was largely motivated by well-meaning envy of our colleagues who develop Linux device-drivers,” says Javier Serrano, an engineer at CERN’s Beams Department and the founder of the OHR. “They are part of a very large community of competent designers who share their knowledge and time in order to come up with the best possible operating system. They learn a lot and have lots of fun in the process. This enables them to provide better drivers faster to our CERN clients,” he continues. “We wanted that, and found out that there was no intrinsic reason why hardware development should be any different. After all, we all work with computers and the products of our efforts are also binary files, which later become pieces of hardware.”

One of the main factors leading to the creation of the OHR was the wish to avoid duplication by simply sharing results across different teams that might be working simultaneously towards the solution of the same problem. Sharing the achievements of each researcher in the repository also results in an improved quality of work. “Sharing design effort with other people has forced us to be better in a number of areas,” states Serrano. “You can’t share without a proper preliminary specification-phase and good documentation. You also can’t share if you design a monolithic solution rather than a modular one from which you and others can pick bits and pieces to use in other projects. The first time somebody comes and takes a critical look at your project it feels a bit awkward, but then you realize how much great talent there is out there and how these people can help, especially in areas that are not your main domain of competence.”

Under licence

Two years after its creation, the OHR currently hosts more than 40 projects from institutes that include CERN, GSI and the University of Cape Town. Such a wealth of knowledge in electronics design can now be shared under the newly published CERN Open Hardware Licence (OHL), which was released in March and is available on the OHR. “In the spirit of knowledge sharing and dissemination, this licence governs the use, copying, modification and distribution of hardware design documentation, and the manufacture and distribution of products,” explains Myriam Ayass, legal adviser of the Knowledge and Technology Transfer Group at CERN and author of the CERN OHL. The documentation that the OHL refers to includes schematic diagrams, designs, circuit or circuit-board layouts, mechanical drawings, flow charts and descriptive texts, as well as other explanatory material. The documentation can be in any medium, including – but not limited to – computer files and representations on paper, film, or other media.

The introduction of the CERN OHL is indeed a novelty in which the long-standing practice of sharing hardware design has adopted a clear policy for the management of intellectual property. “The CERN OHL is to hardware what the General Public Licence is to software. It defines the conditions under which a licensee will be able to use or modify the licensed material,” explains Ayass. “The concept of ‘open-source hardware’ or ‘open hardware’ is not yet as well known or widespread as the free software or open-source software concept,” she continues. “However, it shares the same principles: anyone should be able to see the source (the design documentation in the case of hardware), study it, modify it and share it. In addition, if modifications are made and distributed, it must be under the same licence conditions – this is the ‘persistent’ nature of the licence, which ensures that the whole community will continue benefiting from improvements, in the sense that everyone will in turn be able to make modifications to these improvements.”

Despite these similarities, the application of “openness” in the two domains – software and hardware – differs substantially because of the nature of the “products”. “In the case of hardware, physical resources must be committed for the creation of physical devices,” Ayass points out. “The CERN OHL thus specifically states that manufacturers of such products should not imply any kind of endorsement or responsibility on the part of the designer(s) when producing and/or selling hardware based on the design documents. This is important in terms of legal risks associated with engaging in open-source hardware, and properly regulating this is a prerequisite for many of those involved.”

The OHR also aims to promote a new business model in which companies can play a variety of roles, design open hardware in collaboration with other designers or clients and get paid for that work. As Serrano explains: “Companies can also commercialize the resulting designs, either on their own or as part of larger systems. Customers, on their side, can debug designs and improve them very efficiently, ultimately benefiting not only their own systems but also the companies and other clients.”

“The fact that the designs are ‘open’ also means that anyone can manufacture the product based on this design – from individuals to research institutes to big companies – and commercialize it. This is one approach of technology transfer that nicely combines dissemination of the technology and of the accompanying knowledge,” adds Ayass. This combining of an innovative business model and the OHL is finding a positive response in the commercial world. “We are very excited because we are proving that there is no contradiction between commercial hardware and openness,” says Serrano, who concludes: “The CERN OHL will be a great tool for us to collaborate with other institutes and companies.”

• For more about the OHR see www.ohwr.org. For more about the CERN OHL, see www.ohwr.org/cernohl.

ALICE enters new territory in heavy-ion collisions

The goal of ALICE (A Large Ion Collider Experiment) is to measure the properties of strongly interacting matter generated in heavy-ion collisions at the LHC at CERN. On 7 November 2010, the LHC became the world’s most energetic heavy-ion accelerator when lead nuclei collided at a centre-of-mass energy √s_NN = 2.76 TeV per colliding nucleon pair. This is an energy more than 10 times higher than that of the previous record holder, the Relativistic Heavy-Ion Collider (RHIC) at Brookhaven National Laboratory in New York.

Quantum chromodynamics (QCD), the theory of strong interactions, predicts that at a temperature of about 170 MeV (2 × 10¹² K), nuclear matter undergoes a phase transition from its normal hadronic state to a deconfined partonic phase, the quark–gluon plasma (QGP). This is about 100,000 times hotter than the core of the Sun, and such extreme conditions occur only under special circumstances. One such circumstance is the early universe, where the QGP filled all space a few microseconds after the Big Bang; another is the head-on collision of heavy ions at the LHC and RHIC, where a QGP may be created for a fleeting instant.
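The conversion between the two temperature units quoted above is simply T[K] = E/kB; a quick check:

```python
# Converting the QCD transition temperature from MeV to kelvin, T = E / k_B.

K_B_EV_PER_K = 8.617333e-5        # Boltzmann constant [eV/K]

T_mev = 170.0
T_kelvin = T_mev * 1e6 / K_B_EV_PER_K
print(f"{T_kelvin:.1e} K")        # ~2.0e12 K, as quoted above
```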

RHIC has now been running for a decade. One of its spectacular findings was that the matter generated in heavy-ion collisions flows like a liquid with very low internal resistance to flow, almost at the limit of what is allowed for any material in nature. This tells us that the constituents of this matter are quite different from freely interacting quarks and gluons. This almost-perfect fluid has been found to be opaque to even the most energetic partons (quarks and gluons), which appear as “jets” of particles from the collisions – an effect known as jet quenching. The physical mechanisms underlying these phenomena are not well understood. One of the first tasks of heavy-ion studies at the LHC is to “rediscover” these effects and probe them further with new tools as the basis for a much broader and deeper study of the QGP in the coming years. So what have we learnt at the LHC from heavy ions so far?

“Calibrating” at the LHC

To explore the features of hot QCD matter we have to calibrate our tools. Interpretation of the complex interactions of heavy ions relies on theoretical modelling, beginning with the initial conditions of the hot system – the fireball – at the instant after the collision. One of the crucial inputs for calibrating the models is the distribution of the multiplicity (total number) of particles produced in a collision. This tells us a great deal about how the quarks and gluons in the incoming nuclei transform into the particles (pions, kaons and so on) observed in the detector.

The number of generated particles is correlated with the impact parameter of the collision; that is, the distance between centres of the colliding nuclei. Small impact parameters, in which the colliding nuclei hit each other nearly head-on so that the largest number of incoming protons and neutrons “participate” in the collision, generate the most particles. Thus, ordering the ensemble of measured collisions according to their multiplicity allows them to be sorted into different classes of impact parameter. The number of created particles can also tell us about the energy density reached within the collisions and the temperature of the fireball.
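In practice this ordering amounts to slicing the event sample into multiplicity percentiles (“centrality classes”). The toy below illustrates the bookkeeping with randomly generated multiplicities; it is not ALICE data or the ALICE centrality determination, which uses detector-level quantities and a Glauber-model fit.

```python
import numpy as np

# Toy centrality classes: order events by multiplicity and slice the sample
# into percentiles (the top 5% in multiplicity = the "0-5%" most central class).
# Multiplicities here are random toys, not ALICE measurements.

rng = np.random.default_rng(42)
mult = rng.exponential(scale=400.0, size=100_000)   # toy event multiplicities

for lo, hi in [(0, 5), (5, 10), (10, 20), (20, 40)]:
    upper = np.percentile(mult, 100 - lo)           # higher multiplicity ...
    lower = np.percentile(mult, 100 - hi)           # ... means more central
    sel = (mult > lower) & (mult <= upper)
    print(f"{lo:2d}-{hi:2d}%: multiplicity in ({lower:6.0f}, {upper:6.0f}], "
          f"{sel.sum()} events")
```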

Multiplicity measurements by the ALICE experiment show that the system created at the LHC initially has much higher energy density and is at least 30% hotter than at RHIC, resulting in about double the particle multiplicity for each colliding nucleon pair (Aamodt et al. 2010a). Figure 1 shows the energy dependence of particle production with the new measurement obtained at the LHC.

Perhaps surprisingly, despite their vastly different collision energies, the growth in particle multiplicity from RHIC to the LHC is similar at all impact parameters, as figure 2 shows (Aamodt et al. 2011). These measurements by ALICE also show that various predictions driven either by phenomenological extrapolation from the lower energies or by colour-charge density-saturation models are inadequate at the LHC.

A perfect liquid at the LHC?

Off-centre nuclear collisions, with a finite impact parameter, create a strongly asymmetric “almond-shaped” fireball. However, experiments cannot measure the spatial dimensions of the interaction (except in special cases, for example in the production of pions). Instead, they measure the momentum distributions of the emitted particles. A correlation between the measured azimuthal momentum distribution of particles emitted from the decaying fireball and the initial spatial asymmetry can arise only from multiple interactions between the constituents of the created matter; in other words it tells us about how the matter flows, which is related to its equation of state and its thermodynamic transport properties.

The measured azimuthal distribution of particles in momentum space can be decomposed into Fourier coefficients. The second Fourier coefficient (v2), called elliptic flow, is particularly sensitive to the internal friction, or viscosity, of the fluid – more precisely, to η/s, the ratio of the shear viscosity (η) to the entropy density (s) of the system. For a good fluid such as water, the η/s ratio is small. A “thick” liquid, such as honey, has a large value of η/s. Comparison of the elliptic flow measured in heavy-ion collisions at RHIC with theoretical models suggests that the hot matter created in the collision flows like a fluid with little friction, with η/s close to the conjectured lower bound for any fluid, η/s = ħ/(4πkB), where ħ is the reduced Planck constant and kB is the Boltzmann constant.
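As a rough illustration of this Fourier language – not the ALICE flow analysis, which must also estimate the event plane and control non-flow correlations – the toy Monte Carlo below generates particles with a known v2 and recovers it as the average of cos 2(φ − Ψ):

```python
import numpy as np

# Toy elliptic flow: sample azimuthal angles from
#   dN/dphi ~ 1 + 2*v2*cos(2*(phi - Psi_RP))
# and recover v2 as <cos 2(phi - Psi_RP)>.  Assumes the reaction-plane angle
# Psi_RP is known, which is not the case in a real measurement.

rng = np.random.default_rng(1)
v2_in, psi_rp, n = 0.08, 0.3, 200_000

# accept-reject sampling of phi
phi = rng.uniform(0.0, 2.0 * np.pi, 4 * n)
pdf = 1.0 + 2.0 * v2_in * np.cos(2.0 * (phi - psi_rp))
keep = rng.uniform(0.0, 1.0 + 2.0 * v2_in, phi.size) < pdf
phi = phi[keep][:n]

v2_est = np.mean(np.cos(2.0 * (phi - psi_rp)))
print(f"input v2 = {v2_in}, estimated v2 = {v2_est:.3f}")
```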

In heavy-ion collisions at the LHC, the ALICE collaboration found that the elliptic flow of charged particles increases by about 30% compared with flow measured at the highest energy at RHIC of 0.2 TeV (figure 3). However, hydrodynamic calculations tuned to reproduce the results at RHIC – when recalibrated to the LHC energy regime – reproduce the new measurements well. The hot and dense matter at the LHC also behaves like a fluid with almost zero viscosity. With these measurements, ALICE has just begun to explore the temperature dependence of η/s and we anticipate many more in-depth flow-related measurements at the LHC that will constrain the hydrodynamic features of the QGP even further.

Partonic energy loss

A basic process in QCD is the energy loss of a fast parton in a medium composed of colour charges. This phenomenon, “jet quenching”, is especially useful in the study of the QGP, using the naturally occurring products (jets) of the hard scattering of quarks and gluons from the incoming nuclei. A highly energetic parton (a colour charge) probes the coloured medium rather like an X-ray probes ordinary matter. The production of these partonic probes in hadronic collisions is well understood within perturbative QCD. The theory also shows that a parton traversing the medium will lose a fraction of its energy in emitting many soft (low energy) gluons. The amount of the radiated energy is proportional to the density of the medium and to the square of the path length travelled by the parton in the medium. Theory also predicts that the energy loss depends on the flavour of the parton.

Jet quenching was first observed at RHIC by measuring the yields of hadrons with high transverse momentum (pT). These particles are produced via fragmentation of energetic partons. The yields of these high-pT particles in central nucleus–nucleus collisions were found to be a factor of five lower than expected from the measurements in proton–proton reactions. ALICE has recently published the measurement of charged particles in central heavy-ion collisions at the LHC. As at RHIC, the production of high-pT hadrons at the LHC is strongly suppressed. However, the observations at the LHC show qualitatively new features (see box). The observation from ALICE is consistent with reports from the ATLAS and CMS collaborations on direct evidence for parton energy loss within heavy-ion collisions using fully reconstructed back-to-back jets of particles associated with hard parton scatterings. The latter two experiments have shown a strong energy imbalance between the jet and its recoiling partner (G Aad et al. 2010 and CMS collaboration 2011). This imbalance is thought to arise because one of the jets traversed the hot and dense matter, transferring a substantial fraction of its energy to the medium in a way that is not recovered by the reconstruction of the jets.
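The suppression quoted here is usually expressed as the nuclear modification factor RAA: the yield in nucleus–nucleus collisions divided by the proton–proton yield scaled by the number of binary collisions, so that “a factor of five lower” corresponds to RAA ≈ 0.2. A schematic definition follows, with illustrative numbers only, not ALICE results.

```python
# Nuclear modification factor R_AA = (AA yield) / (N_coll * pp yield).
# R_AA = 1 means no medium effect; R_AA ~ 0.2 means a factor-of-five suppression.
# The numbers below are illustrative only.

def r_aa(yield_aa, n_coll, yield_pp):
    return yield_aa / (n_coll * yield_pp)

print(r_aa(yield_aa=4.0e-5, n_coll=1000.0, yield_pp=2.0e-7))   # -> 0.2
```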

With the first findings on hydrodynamic features of the medium created at the LHC and its opaqueness to energetic partons, the LHC has, to a large extent, reproduced measurements at RHIC. The measurements at the LHC will, however, profit from the denser medium and its longer lifetime. The vast kinematic reach provided by the higher-energy collision system enables qualitatively new measurements of the QGP.

On 23–28 May, Quark Matter, a key conference in heavy-ion physics, takes place in Annecy. The most recent experimental results and state-of-the-art theoretical concepts and calculations will be presented, targeted at a detailed understanding of the QGP at RHIC and at the LHC. The ALICE collaboration will report on the observations discussed here and will also present new, in-depth studies of the elliptic flow with respect to the type of particle and its mass. In addition, the first studies addressing the interplay between collective features of the medium and jet production at the LHC will be shown, and ALICE will present its first insight into the energy loss of heavy flavours (charm and bottom quarks) in the hot QCD medium. In the coming years, all of these crucial measurements will help to uncover the key properties of the QGP at the LHC.
