The high-energy frontier (archive)

The principal goal of the experimental programme at the LHC is to make the first direct exploration of a completely new region of energies and distances, to the tera-electron-volt scale and beyond. The main objectives include the search for the Higgs boson and whatever new physics may accompany it, such as supersymmetry or extra dimensions, and – perhaps above all – the discovery of something that the theorists have not predicted.

The Standard Model of particles and forces summarizes our present knowledge of particle physics. It extends and generalizes the quantum theory of electromagnetism to include the weak nuclear forces responsible for radioactivity in a single unified framework; it also provides an equally successful analogous theory of the strong nuclear forces.

The conceptual basis for the Standard Model was confirmed by the discovery at CERN of the predicted weak neutral-current form of radioactivity and, subsequently, of the quantum particles responsible for the weak and strong forces, at CERN and DESY respectively. Detailed calculations of the properties of these particles, confirmed in particular by experiments at the LEP collider, have since enabled us to establish the complete structure of the Standard Model; data taken at LEP agreed with the calculations at the per mille level.

These successes raise deeper problems, however. The Standard Model does not explain the origin of mass, nor why some particles are very heavy while others have no mass at all; it does not explain why there are so many different types of matter particles in the universe; and it does not offer a unified description of all the fundamental forces. Indeed, the deepest problem in fundamental physics may be how to extend the successes of quantum physics to the force of gravity. It is the search for solutions to these problems that defines the current objectives of particle physics – and the programme for the LHC.

Higgs, hierarchy and extra dimensions

Understanding the origin of mass will unlock some of the basic mysteries of the universe: the mass of the electron determines the sizes of atoms, while radioactivity is weak because the W boson weighs as much as a medium-sized nucleus. Within the Standard Model the key to mass lies with an essential ingredient that has not yet been observed, the Higgs boson; without it the calculations would yield incomprehensible infinite results. The agreement of the data with the calculations implies not only that the Higgs boson (or something equivalent) must exist, but also suggests that its mass should be well within the reach of the LHC.

Experiments at LEP at one time found a hint of the existence of the Higgs boson, but the searches ultimately proved inconclusive and told us only that it must weigh at least 114 GeV. At the LHC, the ATLAS and CMS experiments will be looking for the Higgs boson in several ways. The particle is predicted to be unstable, decaying for example to photons, bottom quarks, tau leptons, W or Z bosons (figure 1). It may well be necessary to combine several different decay modes to uncover a convincing signal, but the LHC experiments should be able to find the Higgs boson even if it weighs as much as 1 TeV.

While resolving the Higgs question will set the seal on the Standard Model, there are plenty of reasons to expect other, related new physics, within reach of experiments at the LHC. In particular, the elementary Higgs boson of the Standard Model seems unlikely to exist in isolation. Specifically, difficulties arise in calculating quantum corrections to the mass of the Higgs boson. Not only are these corrections infinite in the Standard Model, but, if the usual procedure is adopted of controlling them by cutting the theory off at some high energy or short distance, the net result depends on the square of the cut-off scale. This implies that, if the Standard Model is embedded in some more complete theory that kicks in at high energy, the mass of the Higgs boson would be very sensitive to the details of this high-energy theory. This would make it difficult to understand why the Higgs boson has a (relatively) low mass and, by extension, why the scale of the weak interactions is so much smaller than that of grand unification, say, or quantum gravity.
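
In slightly more quantitative terms (a schematic, textbook-level estimate rather than a figure quoted in the article), the dominant one-loop correction from a fermion with coupling $\lambda_f$ to the Higgs boson, such as the top quark, grows with the square of the cut-off $\Lambda$:

$$\delta m_H^2 \sim -\frac{\lambda_f^2}{8\pi^2}\,\Lambda^2 .$$

For $\Lambda$ anywhere near the grand-unification or Planck scale, this correction dwarfs the electroweak scale unless it is cancelled or cut off.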

This is known as the “hierarchy problem”. One could try to resolve it simply by postulating that the underlying parameters of the theory are tuned very finely, so that the net value of the Higgs boson mass after adding in the quantum corrections is small, owing to some suitable cancellation. However, it would be more satisfactory either to abolish the extreme sensitivity to the quantum corrections, or to cancel them in some systematic manner.

One way to achieve this would be if the Higgs boson is composite and so has a finite size, which would cut the quantum corrections off at a relatively low energy scale. In this case, the LHC might uncover a cornucopia of other new composite particles with masses around this cut-off scale, near 1 TeV.

The alternative, more elegant, and in my opinion more plausible, solution is to cancel the quantum corrections systematically, which is where supersymmetry could come in. Supersymmetry would pair up fermions, such as the quarks and leptons, with bosons, such as the photon, gluon, W and Z, or even the Higgs boson itself. In a supersymmetric theory, the quantum corrections due to the pairs of virtual fermions and bosons cancel each other systematically, and a low-mass Higgs boson no longer appears unnatural. Indeed, supersymmetry predicts that the Higgs boson should probably weigh less than about 130 GeV, in line with the global fit to precision electroweak data.
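
Schematically (again an illustrative estimate, not text from the article), each fermion loop is accompanied by loops of its scalar superpartners, so the pieces proportional to $\Lambda^2$ drop out and the remainder is only logarithmically sensitive to the cut-off:

$$\delta m_H^2 \sim \frac{\lambda^2}{8\pi^2}\,\bigl(m_{\tilde f}^2 - m_f^2\bigr)\,\ln\frac{\Lambda}{m_{\tilde f}},$$

which stays modest provided the superpartner masses are not far above the TeV scale – one origin of the expectation that sparticles should be within reach of the LHC.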

The fermions and bosons of the Standard Model, however, do not pair up with each other in a neat supersymmetric manner. The theory, therefore, requires that a supersymmetric partner, or sparticle, as yet unseen, accompanies each of the Standard Model particles. Thus, this scenario predicts a “scornucopia” of new particles that should weigh less than about 1 TeV and could be produced by the LHC (figure 3).

Another attraction of supersymmetry is that it facilitates the unification of the fundamental forces. Extrapolating the strengths of the strong, weak and electromagnetic interactions measured at low energies does not give a common value at any energy, in the absence of supersymmetry. However, there would be a common value, at an energy around 10¹⁶ GeV, in the presence of supersymmetry. Moreover, supersymmetry provides a natural candidate, in the form of the lightest supersymmetric particle (LSP), for the cold dark matter required by astrophysicists and cosmologists to explain the amount of matter in the universe and the formation of structures within it, such as galaxies. In this case, the LSP should have neither strong nor electromagnetic interactions, since otherwise it would bind to conventional matter and be detectable. Data from LEP and direct searches have already excluded sneutrinos as LSPs. Nowadays, the “scandidates” most considered are the lightest neutralino and (to a lesser extent) the gravitino.
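
The unification argument can be made concrete with the standard one-loop running of the inverse couplings, $1/\alpha_i(Q) = 1/\alpha_i(M_Z) - (b_i/2\pi)\ln(Q/M_Z)$. The sketch below is my own illustration (the input values at the Z mass are approximate and the one-loop coefficients are the textbook ones; none of it comes from the article); it simply asks at what scale the three couplings come closest together, with and without supersymmetric particles in the running.

```python
import numpy as np

M_Z = 91.19                                   # GeV
inv_alpha_MZ = np.array([59.0, 29.6, 8.5])    # approximate 1/alpha_1,2,3 at M_Z (GUT-normalised U(1))

b_SM   = np.array([41/10, -19/6, -7])         # one-loop beta coefficients, Standard Model
b_MSSM = np.array([33/5, 1, -3])              # one-loop beta coefficients, MSSM

def inv_alpha(Q, b):
    """One-loop running of the three inverse gauge couplings up to scale Q (GeV)."""
    return inv_alpha_MZ - b / (2 * np.pi) * np.log(Q / M_Z)

for label, b in [("SM", b_SM), ("MSSM", b_MSSM)]:
    Q = np.logspace(2, 18, 1601)              # scan from 100 GeV to 10^18 GeV
    a = np.array([inv_alpha(q, b) for q in Q])
    spread = a.max(axis=1) - a.min(axis=1)    # how far apart the three couplings are
    i = spread.argmin()
    print(f"{label}: couplings closest at Q ~ {Q[i]:.2e} GeV, spread {spread[i]:.2f}")
# With the MSSM coefficients the couplings nearly meet near 2e16 GeV;
# with the Standard Model coefficients alone they never converge to a common value.
```

With the supersymmetric coefficients the three curves nearly meet at around 2 × 10¹⁶ GeV, whereas with the Standard Model coefficients alone they never do – which is the statement made in the text above.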

Assuming that the LSP is the lightest neutralino, the parameter space of the constrained minimal supersymmetric extension of the Standard Model (CMSSM) is restricted by the need to avoid the stau being the LSP, by the measurements of b → sγ decay that agree with the Standard Model, by the range of cold dark-matter density allowed by astrophysical observations, and by the measurement of the anomalous magnetic moment of the muon (gμ–2). These requirements are consistent with relatively large masses for the lightest and next-to-lightest visible supersymmetric particles, as figure 4 indicates. The figure also shows that the LHC can detect most of the models that provide cosmological dark matter (though this is not guaranteed), whereas the astrophysical dark matter itself may be detectable directly for only a smaller fraction of models.

Within the overall range allowed by the experimental constraints, are there any hints at what the supersymmetric mass scale might be? The high-precision measurements of mW tend to favour a relatively small mass scale for sparticles. On the other hand, the rate for b → sγ shows no evidence for light sparticles, and the experimental upper limit on Bs → μ⁺μ⁻ begins to exclude very small masses. The strongest indication for new low-energy physics, for which supersymmetry is just one possibility, is offered by gμ–2. Putting this together with the other precision observables gives a preference for light sparticles.

Other proposals for additional new physics postulate the existence of new dimensions of space, which might also help to deal with the hierarchy problem. Clearly, space is three-dimensional on the distance scales that we know so far, but the suggestion is that there might be additional dimensions curled up so small as to be invisible. This idea, which dates back to the work of Theodor Kaluza and Oskar Klein in the 1920s, has gained currency in recent years with the realization that string theory predicts the existence of extra dimensions and that some of these might be large enough to have consequences observable at the LHC. One possibility that has emerged is that gravity might become strong when these extra dimensions appear, possibly at energies close to 1 TeV. In this case, some variants of string theory predict that microscopic black holes might be produced in the LHC collisions. These would decay rapidly via Hawking radiation, but measurements of this radiation would offer a unique window onto the mysteries of quantum gravity.

If the extra dimensions are curled up on a sufficiently large scale, ATLAS and CMS might be able to see Kaluza–Klein excitations of Standard Model particles, or even the graviton. Indeed, the spectroscopy of some extra-dimensional theories might be as rich as that of supersymmetry while, in some theories, the lightest Kaluza–Klein particle might be stable, rather like the LSP in supersymmetric models.

Back to the beginning

By colliding particles at very high energies we can recreate the conditions that existed a fraction of a second after the Big Bang, which allows us to probe the origins of matter. Experiments at LEP revealed that there are just three “families” of elementary particles: one that makes up normal stable matter, and two heavier unstable families that were revealed in cosmic rays and accelerator experiments. The Standard Model does not explain why there are three and only three families, but it may be that their existence in the early universe was necessary for matter to emerge from the Big Bang, with little or no antimatter.

Andrei Sakharov was the first to point out that particle physics could explain the origin of matter in the universe, provided that matter and antimatter have slightly different properties – differences of the kind observed in the decays of K and B mesons, which contain strange and bottom quarks, members of the heavier families. These differences are manifest in the phenomenon of CP violation. Present data are in good agreement with the amount of CP violation allowed by the Standard Model, but this would be insufficient to generate the matter seen in the universe.

The Standard Model accounts for CP violation within the context of the Cabibbo–Kobayashi–Maskawa (CKM) matrix, which links the interactions between quarks of different type (or flavour). Experiments at the B-factories at KEK and SLAC have established that the CKM mechanism is dominant, so the question is no longer whether this is “right”. The task is rather to look for additional sources of CP violation that must surely exist, to create the cosmological matter–antimatter asymmetry via baryogenesis in the early universe. If the LHC does observe any new physics, such as the Higgs boson and/or supersymmetry, it will become urgent to understand its flavour and CP properties.

The LHCb experiment will be dedicated to probing the differences between matter and antimatter, notably looking for discrepancies with the Standard Model. The experiment has unique capabilities for probing the decays of mesons containing both bottom and strange quarks. It will be able to measure subtle CP-violating effects in Bs decays, and will also improve measurements of all the angles of the unitarity triangle, which expresses the amount of CP violation in the Standard Model. The LHC will also provide high sensitivity to rare B decays, to which the ATLAS and CMS experiments will contribute, in particular, and which may open another window on CP violation beyond the CKM model.

In addition to the studies of proton–proton collisions, heavy-ion collisions at the LHC will provide a window onto the state of matter that would have existed in the early universe at times before quarks and gluons “condensed” into hadrons, and ultimately the protons and neutrons of the primordial elements. When heavy ions collide at high energies they form for an instant a “fireball” of hot, dense matter. Studies, in particular by the ALICE experiment, may resolve some of the puzzles posed by the data already obtained at the Relativistic Heavy Ion Collider (RHIC) at Brookhaven. These data indicate that there is very rapid thermalization in the collisions, after which a fluid with very low viscosity and large transport coefficients seems to be produced. One of the surprises is that the medium produced at RHIC seems to be strongly interacting. The final state exhibits jet quenching and the semblance of cones of energy deposition akin to Mach shock waves or Cherenkov radiation patterns, indicative of very fast particles moving through a medium faster than sound or light.

Experiments at the LHC will enter a new range of temperatures and pressures, thought to be far into the quark–gluon plasma regime, which should test the various ideas developed to explain results from RHIC. The experiments will probably not see a real phase transition between the hadronic and quark–gluon descriptions; it is more likely to be a cross-over that may not have a distinctive experimental signature at high energies. However, it may well be possible to see quark–gluon matter in its weakly interacting high temperature phase. The larger kinematic range should also enable ideas about jet quenching and radiation cones to be tested.

First expectations

The first step for the experimenters will be to understand the minimum-bias events and compare measurements of jets with the predictions of QCD. The next Standard Model processes to be measured and understood will be those producing the W- and Z-vector bosons, followed by top-quark physics. Each of these steps will allow the experimental teams to understand and calibrate their detectors, and only after these steps will the search for the Higgs boson start in earnest. The Higgs will not jump out in the same way as did the W and Z bosons, or even the top quark, and the search for it will demand an excellent understanding of the detectors. Around the time that Higgs searches get underway, the first searches for supersymmetry or other new physics beyond the Standard Model will also start.

In practice, the teams will look for generic signatures of new physics that could be due to several different scenarios. For example, missing-energy events could be due to supersymmetry, extra dimensions, black holes or the radiation of gravitons into extra dimensions. The challenge will then be to distinguish between the different scenarios. For example, in the case of distinguishing between supersymmetry and universal extra dimensions, the spectra of higher excitations would be different in the two scenarios, the different spins of particles in cascade decays would yield distinctive spin correlations, and the spectra and asymmetries of, for instance, dileptons, would be distinguishable.

What is the discovery potential of this initial period of LHC running? Figure 5a shows that a Standard Model Higgs boson could be discovered with 5 σ significance with 5 fb⁻¹ of integrated and well-understood luminosity, whereas 1 fb⁻¹ would already suffice to exclude a Standard Model Higgs boson at the 95% confidence level over a large range of possible masses. However, as mentioned above, this Higgs signal would receive contributions from many different decay signatures, so the search for the Higgs boson will require researchers to understand the detectors very well to find each of these signatures with good efficiency and low background. Therefore, announcement of the Higgs discovery may not come the day after the accelerator produces the required integrated luminosity!

Paradoxically, some new physics scenarios such as supersymmetry may be easier to spot, if their mass scale is not too high. For example, figure 5b shows that 0.1 fb⁻¹ of luminosity should be enough to detect the gluino at the 5 σ level if its mass is less than 1.2 TeV, and to exclude its existence below 1.5 TeV at the 95% confidence level. This amount of integrated luminosity could be gathered with an ideal month’s running at 1% of the design instantaneous luminosity.
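
As a rough cross-check of that last statement (my own back-of-the-envelope arithmetic, assuming the nominal design luminosity of 10³⁴ cm⁻²s⁻¹), the integrated luminosity is simply the instantaneous luminosity multiplied by the running time, with 1 fb⁻¹ corresponding to 10³⁹ cm⁻²:

```python
design_lumi = 1e34            # assumed LHC design luminosity, cm^-2 s^-1
fraction    = 0.01            # running at 1% of design
month       = 30 * 24 * 3600  # an ideal month of continuous running, in seconds

integrated_cm2 = design_lumi * fraction * month   # integrated luminosity in cm^-2
integrated_fb  = integrated_cm2 / 1e39            # 1 fb^-1 = 10^39 cm^-2

print(f"~{integrated_fb:.2f} fb^-1")
# ~0.26 fb^-1, comfortably above the 0.1 fb^-1 quoted in the text.
```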

We do not know which, if any, of the theories that I have mentioned nature has chosen, but one thing is sure: once the LHC starts delivering data, our hazy view of this new energy scale will begin to clear dramatically.

Based on the concluding talk at Physics at the LHC, Cracow, 3–8 July 2006 (http://arxiv.org/abs/hep-ph/0611237).

LHC computing: Milestones (archive)

The Grid gets EU funds

Plans for the next generation of network-based information-handling systems took a major step forward when the European Union’s Fifth Framework Information Society Technologies programme concluded negotiations to fund the Data Grid research and development project. The project was submitted to the EU by a consortium of 21 bodies involved in a variety of sciences, from high-energy physics to Earth observation and biology, as well as computer sciences and industry. CERN is the leading and coordinating partner in the project.

Starting from this year, the Data Grid project will receive in excess of €9.8 million for three years to develop middleware (software) to deploy applications on widely distributed computing systems. In addition to receiving EU support, the enterprise is being substantially underwritten by funding agencies from a number of CERN’s member states. Due to the large volume of data that it will produce, CERN’s LHC collider will be an important component of the Data Grid.

As far as CERN is concerned, this programme of work will integrate well into the computing testbed activity that is already planned for the LHC. Indeed, the model for the distributed computing architecture that Data Grid will implement is largely based on the results of the MONARC (Models of Networked Analysis at Regional Centres for LHC experiments) project.

The work that the project will involve has been divided into numbered subsections, or “work packages” (WP). CERN’s main contribution will be to three of these work packages: WP 2, dedicated to data management and data replication; WP 4, which will look at computing-fabric management; and WP 8, which will deal with high-energy physics applications. Most of the resources for WP 8 will come from the four major LHC experimental collaborations: ATLAS, CMS, ALICE and LHCb.

Other work will cover areas such as workload management (coordinated by the INFN in Italy), monitoring and mass storage (coordinated in the UK by the PPARC funding authority and the UK Rutherford Appleton Laboratory) and testbed and networking (coordinated in France by IN2P3 and the CNRS).

March 2001 p5 (abridged).

 

The Gigabyte System Network

To mark the major international Telecom ’99 exhibition in Geneva, CERN staged a demonstration of the world’s fastest computer-networking standard, the Gigabyte System Network. This is a new networking standard developed by the High-Performance Networking Forum, which is a worldwide collaboration between industry and academia. Telecom ’99 delegates came to CERN to see the new standard in action.

GSN is the first networking standard capable of handling the enormous data rates expected from the LHC experiments. It has a capacity of 800 Mbyte/s (that’s getting on for a full-length feature film every second), making it attractive beyond the realms of scientific research. Internet service providers, for example, expect to require these data rates to supply high-quality multimedia across the Internet within a few years. Today, however, most home network users have to be content with 5 kbyte/s, or about a single frame per second. Even CERN, one of Europe’s largest networking centres, currently has a total external capacity of only 22 Mbyte/s.

November 1999 p10 (abridged).

 

Approval for Grid project for LHC computing

The first phase of the impressive Computing Grid project for CERN’s LHC was approved at a special meeting of CERN’s Council, its governing body, on 20 September.

October 2001 p32 (extract).

After LHC commissioning, the collider’s four giant detectors will be accumulating more than 10 million Gbytes of particle-collision data each year (equivalent to the contents of about 20 million CD-ROMs). To handle this will require a thousand times the computing power available at CERN today.

Nearly 10 000 scientists, at hundreds of universities round the world, will group in virtual communities to analyse this LHC data. The strategy relies on the coordinated deployment of communications technologies at hundreds of institutes via an intricately interconnected worldwide grid of tens of thousands of computers and storage devices.

The LHC Computing Grid project will proceed in two phases. Phase 1, to be activated in 2002 and continuing in 2003 and 2004, will develop the prototype equipment and techniques necessary for the data-intensive scientific computing of the LHC era. In 2005, 2006 and 2007, Phase 2 of the project, which will build on the experience gained in the first phase, will construct the production version of the LHC Computing Grid.

Phase 1 will require an investment at CERN of SFr 30 million (some €20 million) which will come from contributions from CERN’s member states and major involvement of industrial sponsors. More than 50 positions for young professionals will be created. Significant investments are also being made by participants in the LHC programme, particularly in the US and Japan, as well as Europe.

November 2001 p5 (abridged).

LHC computing: Switching on to the Grid (archive)

When CERN’s LHC collider begins operation, it will be the most powerful machine of its type in the world, providing research facilities for thousands of researchers from all over the globe.

The computing capacity required for analysing the data generated by these big LHC experiments will be several orders of magnitude greater than that used by current experiments at CERN, itself already substantial. Satisfying this vast data-processing appetite will require the integrated use of computing facilities installed at several research centres across Europe, the US and Asia.

During the last two years the Models of Networked Analysis at Regional Centres for LHC Experiments (MONARC) project, supported by a number of institutes participating in the LHC programme, has been developing and evaluating models for LHC computing. MONARC has also developed tools for simulating the behaviour of such models when implemented in a wide-area distributed computing environment. This requirement arrived on the scene at the same time as a growing awareness that major new projects in science and technology need matching computer support and access to resources worldwide.

In the 1970s and 1980s the Internet grew up as a network of computer networks, each established to service specific communities and each with a heavy commitment to data processing.

In the late 1980s the World Wide Web was invented at CERN to enable particle physicists scattered all over the globe to access information and participate actively in their research projects directly from their home institutes. The amazing synergy of the Internet, the boom in personal computing and the growth of the Web grips the whole world in today’s dot.com lifestyle.

Internet, Web, what next?

However, the Web is not the end of the line. New thinking for the millennium, summarized in a milestone book entitled The Grid by Ian Foster of Argonne and Carl Kesselman of the Information Sciences Institute of the University of Southern California, aims to develop new software (“middleware”) to handle computations spanning widely distributed computational and information resources – from supercomputers to individual PCs.

Just as a grid for electric power supply brings watts to the wallplug in a way that is completely transparent to the end user, so the new data Grid will do the same for information.

Each of the major LHC experiments – ATLAS, CMS and ALICE – is estimated to require computer power equivalent to 40,000 of today’s PCs. Adding LHCb to the equation gives a total equivalent of 140,000 PCs, and this is only for day 1 of the LHC.

Within about a year this demand will have grown by 30%. The demand for data storage is equally impressive, calling for several thousand terabytes – more information than is contained in the combined telephone directories for the populations of millions of planets. With users across the globe, this represents a new challenge in distributed computing. For the LHC, each experiment will have its own central computer and data storage facilities at CERN, but these have to be integrated with regional computing centres accessed by the researchers from their home institutes.

CERN serves as Grid testbed

As a milestone en route to this panorama, an interim solution is being developed, with a central facility at CERN complemented by five or six regional centres and several smaller ones, so that computing can ultimately be carried out on a cluster in the user’s research department. To see whether this proposed model is on the right track, a testbed is to be implemented using realistic data.

Several nations have launched new Grid-oriented initiatives – in the US led by NASA and the National Science Foundation – while in Europe particle physics provides a natural focus for work in, among others, the UK, France, Italy and Holland. Other areas of science, such as Earth observation and bioinformatics, are also on board. In Europe, European Commission funding is being sought to underwrite this major effort to propel computing into a new orbit.

June 2000 pp17–18.

TOTEM and LHCf: Roman pots for the LHC (archive)

The “Roman pot” technique has become a time-honoured particle-physics approach each time a new energy frontier is opened up, and CERN’s LHC proton collider, which can attain collision energies of 14 TeV, will be no exception. While other detectors look for spectacular head-on collisions, where fragments fly out at wide angles to the direction of the colliding beam, with Roman pots the intention is to get as close as possible to the beams and to intercept particles that have been only slightly deflected.

If two flocks of birds fly into each other, most of the birds usually miss a head-on collision. Likewise, when two counter-rotating beams of particles meet, most of the particles are only slightly deflected, if at all. Paradoxically, most of the particles in a collider do not collide. Of those particles that do, many of them just graze past each other, emerging very close to the particles that are sailing straight through.

These forward particles are also important for measuring the total collision rate (cross-section). In the same way as light diffracting around a small obstacle gives a bright spot in the centre of the geometric shadow, so the wave nature of particles gives a central spot of maximum “brightness”.

To pick up these forward particles means having detectors that venture as near to the path of the colliding beams as possible, like avid spectators at a motor race leaning over the safety barrier. This is where Roman pots come in.

Why Roman? They were first used by a CERN/Rome group in the early 1970s to study the physics at CERN’s Intersecting Storage Rings (ISR), the world’s first high-energy proton–proton collider.

Why pots? The delicate detectors, able to localize the trajectory of subnuclear particles to within 0.1 mm, are housed in a cylindrical vessel. These “pots” are connected to the vacuum chamber of the collider by bellows, which are compressed as the pots are pushed towards the particles circulating inside the vacuum chamber.

The physics debut of these Roman pots was a physics milestone. Experiments at lower energies had found that the proton interaction rate was shrinking, and physicists feared that the proton might shrink out of sight at higher energies. Using the Roman pots, the first experiments at the ISR were able to establish rapidly that the interaction rate of protons (total cross-section) in fact increases at the new energies probed by the ISR.

In their retracted position, the Roman pots do not obstruct the beam, thus leaving the full aperture of the vacuum chamber free for the fat beams encountered during the injection process. Once the collider reaches its coasting energy, the Roman pot is edged inwards until its rim is just 1 mm from the beam, without upsetting the stability of the circulating particles.

Each time a new energy regime is reached in a particle collider, Roman pots are one of the first detectors on the scene, gauging the cross-section at the new energy range. After the ISR, Roman pots have been used at CERN’s proton–antiproton collider, Fermilab’s Tevatron proton–antiproton collider and the HERA electron–proton collider at the DESY laboratory, Hamburg.

In the future, Roman pots will again have their day in the TOTEM experiment at CERN’s LHC proton collider.

April 1999 p8.

LHCf: a tiny new experiment joins the LHC

While most of the LHC experiments are on a grand scale, LHC forward (LHCf) is quite different. Unlike the massive detectors that are used by ATLAS or CMS, LHCf’s largest detector is a mere 30 cm long. Rather like the TOTEM detector, this experiment focuses on forward physics at the LHC. The aim of LHCf is to compare data from the LHC with various shower models that are widely used to estimate the primary energy of ultra-high-energy cosmic rays.

The LHCf detectors will be placed on either side of the ATLAS interaction point, 140 m away along the beam line. This location will allow the observation of particles at nearly zero degrees to the proton beam direction. The detectors comprise two towers of sampling calorimeters designed by Katsuaki Kasahara from the Shibaura Institute of Technology. Each is made of tungsten plates and plastic scintillators 3 mm thick for sampling.

Yasushi Muraki from Nagoya University leads the LHCf collaboration, with 22 members from 10 institutions and four countries. For many of the collaborators this is a reunion, as they had worked on the former Super Proton Synchrotron experiment UA7.

November 2006 p8.

TOTEM goes the distance (archive)

With detectors positioned at distances of 147 and 220 m from the CMS interaction point and others inside CMS, the TOTal Elastic and Diffractive Cross Section Measurement (TOTEM) experiment will measure the total interaction cross-section of protons at the LHC.

The data collected by the experiment will help to improve knowledge of the internal structure of the proton and the principles that determine the shape and form of protons as a function of their energy. Furthermore, TOTEM will allow precise measurements of the LHC luminosity and individual cross-sections used by the other LHC experiments. Specific to the TOTEM experiment are the “Roman pots”. Veritable marvels of technology, these cylindrical vessels can be moved to within 1 mm of the beam centre. They contain detectors that will measure very forward protons, only a few microradians away from the beams, which arise from elastic scattering and diffractive processes.

Inelastic interactions between protons will be studied by gas electron multiplier (GEM) detectors installed in “telescopes”, placed in the forward region of the CMS detector, where the charged-particle densities are estimated to be in the region of 10⁶ cm⁻²s⁻¹. Each of the telescopes contains 20 half-moon detectors arranged in 10 planes, with an inner radius matching the beam pipe. TOTEM will exploit the full decoupling of the charge-amplification and charge-collection regions, which allows freedom in the optimization of the readout structure, a unique property of GEM detectors.

The closer that the Roman pot detectors can get to the path of the beam, the more precise the results. For the LHC, the Roman pots will collect data from a distance of 800 μm from the beam. Several improvements in TOTEM’s detectors will provide an unprecedented level of precision: the thin stainless-steel windows of less than 150 μm in thickness; the flatness of the windows (less than 30 μm); and the precision of the motor mechanism that moves the pots towards the beam. The pots used in the TOTEM experiment are manufactured by VakuumPraha in Prague, according to specification drawings produced at CERN.

In the final configuration, eight Roman pots will be placed in pairs at four locations at Point 5 on the LHC. There are two stations at each end of the CMS detector, positioned at distances of 147 m and 220 m from the collision point (interaction point 5). Although TOTEM and CMS are scientifically independent experiments, the Roman-pot technique will complement the results obtained by the CMS detector and by the other LHC experiments overall. The ATLAS experiment will also be using a pair of Roman pots based on the design developed by TOTEM, with slight adaptations to suit its own specific needs.

TOTEM has now installed all the Roman pots and has equipped a few of them with detectors. This will allow the collaboration to test the movement of the Roman pots with respect to the beams at the LHC start-up and to take some first data. Some detectors were also installed within CMS. Once this year’s experience has been gained, the remaining detectors will be installed during the winter shut-down to make the experiment fully operational for next year’s runs.

• Based on an article in CERN Bulletin 2008 issue 37–38.

LHCf looks forward to high energies

Positioned 140 m from the ATLAS interaction point, the LHCf experiment will attempt to improve the models that describe the disintegration of ultra-high-energy cosmic rays as they enter the atmosphere. This will allow their energies to be determined more accurately and their composition to be analysed with greater precision. This information will help support the hypotheses on the mysterious origins of cosmic rays.

The LHCf detectors are placed along the beam pipe just beyond the experiment cavern, at the point where the pipe splits into two. This location allows them to detect the neutral particles (or their decay products) that are emitted in the forward region and are not bent off course by the magnetic fields of ATLAS and the LHC magnets.

While the old generation of accelerators allowed researchers to verify the cosmic-ray disintegration models up to energies in the region of 10¹⁵ eV, LHCf will test them at energies of up to 10¹⁹ eV. Even if this year’s data is generated by lower-energy collisions, it will still be important as it will lie in the top-most region of data collected from previous experiments.

• Based on an article in CERN Bulletin 2008 issue 37–38.

LHCb: A question of asymmetry

LHCb experiment

Unlike the general-purpose detectors, the LHCb experiment does not cover the full solid angle; its geometry is instead developed along the forward direction with respect to the collision point. For 20 m a series of detector planes collects information on the particles coming from the collision point. This design is optimized for the study of B mesons, which, given their relatively small mass compared with the high energy of the LHC collisions, fly mostly in the forward direction.

B mesons have received increasing attention from theorists and experimentalists alike over recent years because their behaviour seems linked to various quantum phenomena that could shed light on new physics. “Today’s Standard Model of particle physics leaves many unanswered questions,” says Andrei Golutvin, spokesperson of the LHCb collaboration. He has recently taken over this role from Tatsuya Nakada who was the first spokesperson and a founder of the experiment. “A lot of physicists expect new physics to be just around the corner and already accessible at the LHC,” he continues. “General-purpose detectors like ATLAS and CMS will look for direct evidence of the existence of new particles. We have a different strategy. We focus on the study of B mesons, where some of their behaviour is very precisely predicted by the Standard Model. However small, a deviation from these predictions would indicate the existence of new phenomena.”

In recent years, two experiments at B-factories – BaBar at SLAC and Belle at KEK – have shown that the B particles are a key element in the process of understanding CP violation – the subtle asymmetry between matter and antimatter within the Standard Model. However, this does not seem to be enough to generate the absence of antimatter in the universe. “We will study with an unprecedented precision how CP violation takes place in the B-system,” explains Golutvin. “The yet undiscovered heavy particles could be a new source of CP violation that could affect the decays of B particles. The Bs mesons seem particularly interesting,” he continues. “Their loop-dominated decays are potentially very sensitive to new particles that could ‘enter’ in the loop virtually and cause observable effects. For example, if we find that the decay rate of the Bs to a particular final state, such as two muons, is higher than predicted by the Standard Model, it could be an indication of a contribution coming from Higgs bosons or supersymmetric particles.”

The LHC, with its high luminosity and high energy, will provide the LHCb collaboration with a particularly rich harvest of beauty particles, hundreds of times more than those made available by other accelerators to previous experiments. “Both BaBar and Belle, as well as CDF and D0 at the Tevatron proton–antiproton collider, made big contributions to flavour physics, the physics of processes that involve the transformation of quark flavours,” says Golutvin. “Now we know that the indirect contribution of new physics in CP violation is not big, certainly below the 10% level for most of the decay modes. Thanks to the LHC performance, LHCb will be able to study very rare events and show possible new avenues to physics.”

In its 15-year history, the LHCb detector underwent one major layout modification. The modification – known as the “LHCb light” option – reduced the amount of material in the layers the particles cross, thus reducing the background produced by the interaction of primary particles with the material of the detector. “We work out the momentum of charged particles by measuring the bending angle after the dipole magnets. The original idea was to have additional detectors to follow the trajectory of particles inside the magnet, which means of course a more complicated detector,” Golutvin explains. “After an idea by Nakada and with the help of computer simulations, we understood that we could have very robust pattern recognition even without all those chambers.” The result was that about six years ago the LHCb collaboration decided to simplify the detector a little by having no chambers in the magnet. “This minimizes the amount of material along the trajectories of particles and also simplifies the operation of the detector,” says Golutvin. “Besides that, there were a few other minor changes. For example, we decided to use a beryllium beam pipe, also to minimize the background.”

During normal running of the LHC, one of the most beautiful and delicate subdetectors of LHCb, the VErtex LOcator (VELO), sits only 5 mm away from the beam. Its mission is to identify the vertices where the B mesons are produced and where they decay. Given the number of particles that will be produced close to the beam direction, the VELO will receive a great deal of radiation in a short time. “The current VELO will have to be replaced after 3 to 4 years of nominal operation,” confirms Golutvin. “The work on the replacement VELO modules started in July this year and should be completed by April 2010. As for the rest of the detector, it is designed to withstand the radiation during the initial physics programme.”

LHCb is designed to run at a luminosity of a few times 10³² cm⁻²s⁻¹, much smaller than the nominal LHC luminosity of 10³⁴ cm⁻²s⁻¹. This will be achieved by focusing the beams less at the LHCb collision point. The collaboration is considering the possibility of a major upgrade to work at an order of magnitude higher luminosity, after the initial physics programme is completed in about five to six years from now.
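
The defocusing trick can be read off from the usual expression for the luminosity of two head-on Gaussian beams (a textbook formula, not one quoted in the article):

$$\mathcal{L} = \frac{f\, n_b\, N_1 N_2}{4\pi\, \sigma_x \sigma_y},$$

where $f$ is the revolution frequency, $n_b$ the number of colliding bunches, $N_1$ and $N_2$ the bunch populations, and $\sigma_x$, $\sigma_y$ the transverse beam sizes at the collision point. Focusing the beams less means larger $\sigma_x \sigma_y$ and hence a proportionally lower luminosity, without touching the beams delivered to ATLAS and CMS.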

As with the other experiments at the LHC, the LHCb collaboration will use the first run to understand and calibrate the various parts of the detector. After that, it will start physics analysis at the same time as ATLAS and CMS. So just what does the collaboration expect? “As expressed by many people, the following three possible situations would be very exciting for particle physics,” says Golutvin. “The first one is that ATLAS and CMS see some new physics and we don’t. This will be very exciting for them and maybe not so much for us. Still, the physics community will have to explain why the new physics does not seem to affect the quantum loop, in order to understand the exact nature of the new physics. Then there is the second option: ATLAS and CMS don’t see new physics while we see a clear deviation from the Standard Model. This might happen if the new particles are very heavy. We would see their virtual effect but they could not be directly produced at the LHC energies in the other experiments. Of course, the best case is if all the experiments see new physics effects and a coherent scenario can be built for this new physics.”

Nature alone knows which of these scenarios will eventually occur, but it could be that new physics might emerge quickly in LHCb, so Golutvin and the LHCb collaboration remain very optimistic.

LHCb: A beauty of an experiment (archive)

With preparations for the ATLAS and CMS large general-purpose detectors for CERN’s LHC collider now advancing, the initial cast for the LHC experimental programme is extended with the publication of a full technical proposal for the LHCb experiment. The aim of this experiment is to study in detail the physics of the Standard Model’s third (and final) generation of particles, particularly the beauty, or “b” quark contained in B mesons. This third generation of quarks makes possible the mysterious mechanism of CP violation.

When component quarks mutate under the action of the weak force, subtle effects come into play. The first to be discovered was the violation of parity (left–right mirror symmetry) in standard nuclear beta decay. This parity violation is seen even with the up–down quark doublet that makes up protons and neutrons.

Searching for a more reliable mirror to reflect particle interactions, physicists proposed CP symmetry. As well as switching left and right, such a mirror also switches particles and antiparticles – the CP mirror image of a right-handed particle is a left-handed antiparticle. However, having six quarks (arranged pair-wise in three generations) opens up the possibility of violating CP symmetry as well. Such effects had been seen in 1964 with neutral kaons. But these kaon phenomena are only a tiny corner of the Standard Model’s CP violation potential. Much larger effects should happen in the B sector. The race is now on to collect enough B particles to become the first to glimpse this additional CP violation.

While these efforts will surely reveal more CP-violation effects, the full picture will probably emerge only with the interaction rates and energy conditions of the LHC, which will considerably extend the B physics reach. As well as investigating all aspects of CP violation, LHCb would also consolidate our knowledge of particle reactions and explore fully all quark and lepton sectors of the Standard Model.

The LHCb experiment, which so far has attracted some 340 physicists from 40 research centres in 13 countries, aims to exploit the luminosity of 2×10³² cm⁻²s⁻¹ which should be available from the LHC from Day 1. For the other experiments, the LHC’s collision luminosity will be cranked up to 10³⁴ cm⁻²s⁻¹. LHCb expects to harvest about 10¹² b quark–antiquark pairs each year. LHCb is a large single-arm spectrometer covering an angular range from 10 mrad out to 300 mrad and will be housed in the 27 km LHC/LEP tunnel in the Intersection 8 cavern nearest Geneva airport, currently the site of the Delphi experiment at the LEP electron–positron collider.

At the heart of the detector is the vertex detector, studied by a CERN/Amsterdam/Glasgow/Heidelberg/Imperial College London/Kiev/Lausanne/Liverpool/MPI Heidelberg/NIKHEF Amsterdam/Rome 1 team. The vertex detector will record the decays of the B particles, which travel only about 10 mm before decaying. Each of the 17 planes of silicon (radius 6 cm) spaced over a metre consists of two discs to measure radial and polar coordinates. The arrangement should provide a hit resolution of between 6 and 18 microns, and an impact-parameter resolution of 40 microns for high-momentum tracks.

Downstream of the vertex detector, the tracking system reconstructs the trajectories of emerging particles. Using 11 stations spaced over about as many metres, this tracking uses a honeycomb of drift chambers on the outside (where the particle fluxes are lower), enclosing a finer granularity arrangement on the inside. Microstrip gas chambers with gaseous electron multiplication are the prime contender for this part of the detector, but silicon strips and micro-cathode strips are also being investigated. The inner tracker is being investigated by Heidelberg (University and MPI), PNPI St Petersburg and Santiago (Spain), and the outer by Dresden, Free University of Amsterdam, Freiburg, Humboldt Berlin, IHPE Beijing, NIKHEF Amsterdam and Utrecht.

LHCb’s 1.1 tesla superconducting dipole spectrometer magnet (studied by CERN and PSI Villigen) would benefit from the infrastructure developed for the Delphi magnet at LEP. The magnet polarity is reversible to help the systematic study of CP violation effects.

Particle identification is carried out using the ring-imaging Cerenkov (RICH) technique, with the first RICH station, behind the vertex detector, equipped with a 5 cm silica-aerogel radiator and 1 m of C4F10 gas, and the second station, behind the tracker, with 2 m of CF4 gas radiator. Cerenkov photons would be picked up by a hybrid photodiode array, the subject of a vigorous ongoing R&D programme. The RICH study group consists of Cambridge, CERN, Genoa, Glasgow, Imperial College London, Milan and Oxford.

Following the second RICH is the electromagnetic calorimeter for identifying and measuring electrons using a ‘shashlik’ structure of scintillator and lead read out by wavelength-shifting fibres. It has three annular regions with different granularities to optimize readout. Identification of these electromagnetic particles is facilitated by a lead-scintillator preshower detector. Electromagnetic calorimetry is studied by a Bologna/Clermont Ferrand/INR Moscow/ITEP Moscow/Lebedev Moscow/Milan/Orsay/Rome 1/Rome 2 team.

The hadron calorimeter (Bucharest/IHEP Moscow/Kharkov/Rome 1) is of scintillator tiles embedded in iron. Like the electromagnetic calorimeter upstream, it has three zones of granularity. Readout tests with a full-scale module prototype in a beam have already exceeded the expected performance of 50 photoelectrons per GeV. Downstream, shielded by the calorimetry, four layers of muon detector (Beijing/CERN/Hefei/Nanjing/PNPI/Shandong/Rio de Janeiro/Virginia) use multigap resistive plate chambers and cathode pad chambers embedded in iron, with an additional plane of cathode pad chamber muon detectors mounted in front of the calorimeters. As well as muon identification, this provides important input for the triggering.

Data handling will use four levels of triggering (event selection), with initial (level 0) decisions based on a high transverse-momentum particle and using the calorimeter and muon systems. This reduces the 40 MHz input rate by a factor of 40, to about 1 MHz. The next level trigger (level 1) is based on information from the vertex detector (to look for secondary vertices) and from tracking (essentially to confirm high transverse momentum) and reduces the data by a factor of 25 to an output rate of 40 kHz. Level 2, suppressing fake secondary decay vertices, achieves a further 8-fold compression, to about 5 kHz. Level 3 reconstructs B decays to select specific decay channels, achieving another compression factor of 25, and data are written to tape at 200 Hz. Data handling and offline computing are being looked at by Bologna, Cambridge, CERN, Clermont Ferrand, Heidelberg, Lausanne, Lebedev, Marseille, NIKHEF, Orsay, Oxford, Rice and Virginia.
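
The rate reductions quoted above chain together multiplicatively; the following sketch (an illustration of the arithmetic only, with the reduction factors taken from the paragraph above) tracks the event rate through the four trigger levels:

```python
rate_hz = 40e6                                       # bunch-crossing input rate, 40 MHz
reductions = {
    "level 0 (high-pT, calorimeter + muon)": 40,
    "level 1 (vertex + tracking)": 25,
    "level 2 (reject fake secondary vertices)": 8,
    "level 3 (reconstruct specific B decays)": 25,
}

for stage, factor in reductions.items():
    rate_hz /= factor                                # each level divides the surviving rate
    print(f"after {stage}: {rate_hz:,.0f} Hz")
# Final output: 200 Hz written to tape, as stated in the text.
```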

• May 1998 pp3–5 (abridged).

 

Beauty at the LHC

The Standard Model of physics, with its picture of six quarks and leptons grouped in pairs into three generations, is coming under detailed scrutiny as physicists try to understand what makes it work so well. This demands precision probes of all quark channels, rare as well as familiar.

The LHC will be a prolific source of B particles containing the fifth (beauty, b) quark, either in beam–beam collisions or using one of the high-energy proton beams in a fixed-target set-up. Obvious aims of the B-physics programme at the LHC are to investigate the mixing of neutral B mesons, the particle lifetimes and the spectroscopy of beauty baryons. However, the main goal will be to observe CP violation in the neutral B system (neutral mesons combining a b quark with either a d or an s quark).

CP violation – the subtle disregard of an otherwise perfect symmetry under a combined particle–antiparticle and left–right switch – has been known for 30 years but has so far been seen only in the decays of neutral kaons. Its origin is still a mystery, but it is widely believed to be responsible for the universe’s matter–antimatter asymmetry. The Big Bang initially produced equal amounts of matter and antimatter, but the tiny CP-violation mechanism was enough to tilt the balance in favour of matter as we know it.

To complement the B physics capabilities of LHC’s big detectors (ATLAS and CMS), one dedicated B physics experiment is planned for the initial phase of the LHC experimental programme. Three groups submitted Letters of Intent based on different experimental approaches:

• colliding beams at the full LHC 14 TeV collision energy (the COBEX project)

• an internal gas-jet target intercepting a circulating beam, corresponding to a fixed-target collision energy of 114 GeV (the GAJET project; see the estimate after this list)

• a beam extracted from the beam halo by a bent crystal and a septum magnet for a fixed target experiment (the LHB project).
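
The 114 GeV figure for the gas-jet option follows from standard fixed-target kinematics (a worked estimate added here for clarity, assuming a 7 TeV proton beam incident on a proton at rest):

$$\sqrt{s} \simeq \sqrt{2\,E_{\mathrm{beam}}\,m_p c^2} = \sqrt{2 \times 7000 \times 0.938}\ \mathrm{GeV} \approx 115\ \mathrm{GeV},$$

consistent with the figure quoted above, and to be compared with the full 14 TeV available when the two beams collide head-on.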

Considering these ideas, the LHC Experiments Committee pointed out that when the LHC comes on line, initial measurements of CP violation in the B meson system will have been made by several ongoing projects. LHCb will therefore be a second-generation study. While identifying attractive features in all three Letters of Intent, the Committee was of the view that an experiment using the collider approach, handling the full production rate, is the most attractive.

The Committee, whose view was subsequently endorsed by the Research Board, encouraged all participants in the three Letters of Intent to join together to submit a fresh design for a collider-mode B experiment.

• September 1994 p10.

 

Birth of a collaboration

The stage being set for CERN’s LHC proton–proton collider includes a place for an experiment – LHC-B – to study the physics of B particles. The Letter of Intent for this experiment has been reviewed by the appropriate committees, who recommend that the collaboration should now proceed to a vigorous research and development programme for the various detector components en route to a full technical proposal.

By the time the LHC is operational, the B meson system will have been extensively studied elsewhere – in the B factories being built at SLAC (Stanford) and at KEK, Japan, at Cornell’s revamped CESR ring, at the HERA-B experiment at DESY, Hamburg, and at Fermilab’s Tevatron. The LHC-B experiment will therefore be a second-generation study. While all three initially submitted approaches had different appealing features, the collider route, exploiting the full B production rate, was thought to be the most attractive for mature physics. CERN therefore encouraged all participants in the initial B-physics ideas to collaborate in a fresh design for a collider-mode experiment. The result is the LHC-B collaboration, which currently groups almost 200 researchers from 40 institutes in 15 countries, and is growing.

• April/May 1996 pp2–4 (extract).

ALICE: The heavy-ion challenge

When the ideas for ALICE were first formed at the end of 1990, the heavy-ion programme was still in its infancy and very little was known about what physics to expect or what kind of detector would be required. Nevertheless, an expression of interest for a dedicated general-purpose heavy-ion detector was presented at Evian in 1992. “That’s the first appearance of ALICE,” recalls Jürgen Schukraft, who has been at the helm of the experiment since its inception in 1991. “We had to do enormous extrapolations because the LHC was a factor of 300 higher in centre-of-mass energy and a factor of 7 in beam mass compared with the light-ion programme, which started in 1986 at both the CERN SPS and the Brookhaven AGS. It was akin to planning for the International Linear Collider with a centre-of-mass energy of 1 TeV based on knowledge from Frascati’s ADONE machine, one of the first electron–positron colliders running at 3 GeV.”

Sixteen years later, the field of heavy ions is in a mature state. The ALICE collaboration has the benefit of results from the heavy-ion programmes at the SPS and at Brookhaven’s RHIC, to use as guidance, allowing an infinitely better idea of what to look for, as well as the kind of detectors and the precision needed. Heavy ions will collide at the LHC with energy levels 28 times higher than at RHIC and 300 times higher than at the SPS, representing a huge jump in energy density. “The field of heavy ions has gone from the periphery into a central activity of contemporary nuclear physics,” explains Schukraft. “The exciting thing about the LHC is that because of the huge jump in energy compared with RHIC, there are many open questions to be answered and lots of surprises to be expected. While we don’t know the answers yet, today at least we know some of the questions.”

ALICE will study the quark–gluon plasma (QGP), the first evidence of which was discovered at RHIC and the SPS, and will continue the investigations by confirming interpretations and testing predictions at the LHC. “Back in 1992, we were imagining what the quark–gluon plasma would look like and we expected it to behave like an ideal gas, but what we found is that it behaves like a perfect fluid, so it is completely different,” says Schukraft. “This was a very big surprise, because instead of being weakly interacting, or gas-like, it is strongly interacting. It is the best fluid anyone has ever found in nature, much better than liquid helium, for example.” He adds: “The discovery that QCD matter is more like a fluid was made at RHIC. We now expect to see it flow at about the same strength at the LHC if our understanding is correct – because it can’t get any better than ‘ideal’ – or we will be scratching our heads if it behaves differently.”

Another question on the minds of the ALICE collaboration is whether there is not only QGP, but yet another unusual state of matter called colour glass condensate (CGC), which may form at high gluon densities in heavy nuclei. While QGP is hot and dense, CGC is cold and dense, and would exist in the initial state – before the nuclei collide – and then melt away. “We hope to discover new aspects of QCD in the strongly coupled regime, where the strong interaction is actually strong,” says Schukraft. “One of the central concepts of the Standard Model is phase transition and spontaneous symmetry breaking. The QCD phase transition is the only one accessible to study by experiment and ALICE will measure its properties and parameters.”

As the field of heavy ions has unfolded, the ALICE collaborators have been flexible in changing or adding to their detector. Over the course of time, detector systems amounting to some 50% of the experiment have been added to the design in the original Letter of Intent, submitted in the spring of 1993, as a result of the new data from the SPS and RHIC. These additions include the muon spectrometer, a transition-radiation detector and the electromagnetic jet calorimeter, scheduled to be completed in 2011. “Now we know better what we need for this new regime,” explains Schukraft. In addition, some detectors had to be invented from scratch, such as the time-of-flight detector, which was impossible to build at the time the original design was made, and the silicon pixel detectors, which did not exist then.

ALICE expects to record 1 PB of data during the one month per year of heavy-ion operation, at a rate of more than 1.25 GB/s, which presents a huge challenge. According to Schukraft, state-of-the-art data-collection infrastructure during the 1990s worked at a rate of 10 MB/s. “Most people thought 1 GB/s would be a real challenge to reach and that we would have to find a way to reduce the data volume. There were many discussions on how to handle this huge amount of data, yet today rates within a factor of 2–3 of it are quite common. However, 15 years ago one could not dream of handling such a large amount of data at such a rapid rate,” he says. He expects heavy-ion data taking to start by the end of 2009 and, soon after, to begin showing the first interesting results.
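
As a rough cross-check of these numbers (a minimal Python sketch; the 30-day run length and the derived duty cycle are illustrative assumptions, not figures from the article), the quoted recording rate and total volume are mutually consistent:

# Consistency check of the quoted ALICE heavy-ion data figures.
# The rate (1.25 GB/s) and total volume (1 PB) are from the text; the run
# length of ~30 days is assumed and the effective live time is derived.
rate_gb_per_s = 1.25
total_volume_gb = 1.0e6                # 1 PB expressed in GB
live_seconds = total_volume_gb / rate_gb_per_s
print(f"implied live time: {live_seconds / 86400:.1f} days")          # ~9.3 days
print(f"duty cycle over 30 days: {live_seconds / (30 * 86400):.0%}")  # ~30%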

Although the collaboration’s main interest is heavy-ion collisions, for most of the year ALICE will be running with proton–proton collisions, which provide an important reference for the lead–lead measurements. The detectors are optimized for complete particle identification at angles close to 90°, detecting particles from extremely low to fairly high momentum. During the proton runs, ALICE collaborators will be tuning the Monte Carlo generators and evaluating the background and detector performance for QCD measurements, such as charm and beauty production at low transverse momentum.

“What we are doing at the LHC is very exciting,” says Schukraft. “The LHC is really amazing in its ability to combine three different approaches in one machine: high-energy phenomena, producing new particles to be studied by ATLAS and CMS; indirect effects of virtual high-mass particles, studied in LHCb; and distributed energy that heats and melts matter, to be studied by ALICE. We look forward to studying lead–lead collisions at LHC energy scales.”

ALICE: New kid on the block (archive)


In the children’s story, Alice chased a white rabbit down a hole to find herself transported to a magical world. At the LHC, ALICE (A Large Ion Collider Experiment) will be pursuing new states of matter, and the wonderland to be found could be every bit as new and exciting. The LHC will continue CERN’s tradition of diverse beams, being able to accelerate not only protons, but also high-energy beams of lead ions. It is this capability which ALICE is designed to exploit.

The idea of building a dedicated heavy-ion detector for the LHC was first aired at the historic Evian meeting in March 1992. From the ideas presented there, the ALICE collaboration was formed, and in 1993, a Letter of Intent was submitted. High-energy heavy-ion collisions provide a unique laboratory for the study of strongly interacting particles. Quantum chromodynamics (QCD) predicts that at sufficiently high energy densities there will be a phase transition from conventional hadronic matter, where quarks are locked inside nuclear particles, to a plasma of deconfined quarks and gluons. The reverse of this transition is believed to have taken place when the universe was just 10⁻⁵ s old, and may still play a role today in the hearts of collapsing neutron stars.

The feasibility of this kind of research was clearly demonstrated at CERN and Brookhaven with lighter ions in the 1980s. Today’s programme at these laboratories has moved on to heavy ions and is just reaching the energy threshold at which the phase transition is expected to occur. This physics reach will be extended at the RHIC heavy-ion collider at Brookhaven, scheduled to come into operation in 1999. The LHC, with a centre-of-mass energy of around 5.5 TeV per nucleon pair, will push the energy reach even further.

ALICE is bringing members of CERN’s existing heavy-ion community together with a number of groups new to the field drawn from both nuclear and high-energy physics. By LHC standards, the detector is of moderate proportions, being based on the current magnet of LEP’s L3 experiment. When LEP switches off, the L3 magnet will be left in place whilst ALICE is installed. LHC beams will pass through the magnet slightly off-centre, 30 cm higher than the current LEP beams.

On the trail of quark-gluon plasma

Because the physics of the quark-gluon plasma could be very different from that of ordinary matter, the ALICE detector has been designed to cover the full range of possible signatures, whilst being flexible enough to allow future upgrades guided by early results. The detector consists of two main parts, a central detector, embedded within the magnet, and a forward muon spectrometer included as an addendum to the Letter of Intent in 1995. The set-up is completed by zero-degree calorimeters located far downstream in the machine tunnel, to intercept particles emerging very close to the colliding beams.

One of the greatest challenges of heavy-ion physics is to pick out individual tracks from the dense forest of emerging particles. ALICE’s tracking system has been designed for safe and robust pattern recognition within a large-volume solenoid producing a weak field. The L3 magnet, with a field of 0.2 tesla, is ideal for the purpose.
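
The trade-off can be made concrete with the standard relation between transverse momentum, magnetic field and radius of curvature, pT [GeV/c] ≈ 0.3 B [T] R [m]. The short Python sketch below uses only this textbook formula and the 0.2 T field quoted above; the example momenta are illustrative:

# Radius of curvature of a charged track in a solenoidal field:
# pT [GeV/c] ≈ 0.3 * B [T] * R [m]   =>   R = pT / (0.3 * B)
def radius_m(pt_gev_c, b_tesla):
    return pt_gev_c / (0.3 * b_tesla)

B = 0.2  # tesla, the L3 solenoid field
for pt in (0.1, 1.0, 10.0):  # GeV/c, illustrative momenta
    print(f"pT = {pt:5.1f} GeV/c  ->  R = {radius_m(pt, B):6.1f} m")
# A 100 MeV/c track curls with R of about 1.7 m and is still reconstructable,
# while a 10 GeV/c track (R of about 170 m) is nearly straight, so the weak
# field keeps soft tracks in the acceptance at the price of modest momentum
# resolution at high pT.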

The Inner Tracking System, ITS, consists of six cylindrical layers of highly accurate position-sensitive detectors at radii from 3.9 cm to 45 cm, extending to ±45°. Its functions are secondary-vertex recognition, particle identification, tracking, and improving the overall momentum resolution. The different layers are optimized for efficient pattern recognition. Because of the high particle density in the innermost regions, the first four layers provide position information in two dimensions. The first two layers are silicon pixel detectors, and the second two are silicon drift detectors. The two outermost layers will be composed of double-sided silicon microstrip detectors. The complexity and importance of this device is reflected in the number of institutions responsible for its production: Bari, Catania, CERN, Heidelberg, Kharkov, Kiev, Nantes, NIKHEF, Padua, Rez, Rome, St Petersburg, Salerno, Strasbourg, Turin, Trieste and Utrecht.

Central tracking is completed by a Time Projection Chamber, TPC, being built by Bratislava, CERN, Cracow, Darmstadt, Frankfurt, and Lund. Proven technology has been chosen to guarantee reliable performance at extremely high multiplicity. The drawbacks of this technology are high data volumes and relatively low speed. The TPC occupies the radial region from 90 cm to 250 cm and is designed to give a resolution on the rate of energy loss (dE/dx) of better than 7%. It will also serve to identify electrons with momenta up to 2.5 GeV/c.

Identification parade

Two different technologies are under study for the last sub-detector to cover the full azimuthal angle, the particle identification system, PID. Pestov spark counters, single-gap gas-filled parallel-plate devices, are being investigated by Darmstadt, Dubna, Marburg, Moscow-ITEP, Moscow-MePHI, and Novosibirsk, whilst parallel-plate chambers, PPCs, are being developed by CERN, Moscow-ITEP, Moscow-MePHI, and Novosibirsk. The final design is expected to be complete by the end of 1998. The PPCs are less demanding to construct and operate, but the Pestov counters give a timing resolution of less than 50 ps, some four times better than the PPCs.
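
The importance of the timing resolution can be illustrated with the time-of-flight difference between pions and kaons of the same momentum. In the Python sketch below the 3.5 m flight path is an assumed, illustrative value rather than a figure from the article; the formula is the standard relativistic time of flight:

import math

# Time of flight over a path L:  t = L / (beta * c),  beta = p / sqrt(p^2 + m^2)
C_M_PER_NS = 0.299792458           # speed of light in m/ns
M_PI, M_K = 0.1396, 0.4937         # pion and kaon masses in GeV/c^2

def tof_ns(p_gev_c, mass_gev, length_m):
    beta = p_gev_c / math.hypot(p_gev_c, mass_gev)
    return length_m / (beta * C_M_PER_NS)

L = 3.5                            # m, assumed flight path to the barrel PID
for p in (1.0, 2.0):               # GeV/c
    dt_ps = (tof_ns(p, M_K, L) - tof_ns(p, M_PI, L)) * 1000.0
    print(f"p = {p} GeV/c: pi/K time difference = {dt_ps:.0f} ps")
# At 2 GeV/c the pi/K difference is roughly 320 ps: comfortably resolved with
# a 50 ps Pestov counter, but marginal with a ~200 ps parallel-plate chamber.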

A second particle-identification device for higher-momentum particles, the HMPID, is included in the design as a single-arm device above the central PID. A ring-imaging Cerenkov (RICH) detector, being developed by Bari, CERN, Zagreb, and Moscow-INR, is the preferred option. However, an organic-scintillator approach being pursued by Catania and Dubna has not yet been ruled out.

Below the central barrel region of the detector is another single-arm device, the photon spectrometer, PHOS, to measure prompt photons and neutral mesons. It is being prepared by Bergen, Heidelberg, Moscow-Kurchatov, Münster, Protvino, and Prague, using scintillating lead-tungstate crystals developed in the context of CERN’s generic detector R&D effort.

Zero-degree calorimeters, ZDC, will be positioned 92 m from the interaction point to measure the energy carried away by non-interacting beam nucleons, a quantity directly related to the collision geometry. These are calorimeters of the spaghetti type, with quartz fibres as the active medium. Their construction is the responsibility of Turin. Another forward detector, the forward multiplicity detector, FMD, will be embedded within the solenoid with the purpose of providing fast trigger signals and multiplicity information outside the central acceptance of the detector. Innovative micro-channel plate detectors are under consideration by Moscow-Kurchatov and St Petersburg, with conventional silicon multipad detectors as a back-up.

The forward muon spectrometer, FMS, is a major addition to the original design as specified in the Letter of Intent. It was included to measure the complete spectrum of heavy-quark resonances, which are expected to provide a sensitive signal for the production of a quark-gluon plasma. The first section of the spectrometer is an absorber placed inside the solenoid about 1 m from the interaction point. This is followed by a large 3 tesla dipole magnet outside the solenoid containing 10 planes of tracking stations. A second absorber and two further tracking planes provide muon identification and triggering. Teams from CERN, Clermont-Ferrand, Gatchina, Moscow-Kurchatov, Moscow-INR, Nantes, and Orsay are working on a more detailed design for the FMS, which is expected later this year.

Triggering is the responsibility of Birmingham and Kosice. Proton–proton mode and ion–ion mode have different trigger requirements. In proton–proton mode, a minimum-bias trigger is required, whilst for ion–ion collisions the trigger’s function is to select on collision centrality. A level-zero trigger decision is made at around 1.2 microseconds, based on centrality information from the FMD. At level-one (2 microseconds) this is supplemented by the ZDC. A dimuon trigger from the FMS also contributes to level-one. The final level-two trigger decision is made after further processing, at 100 microseconds.
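
The staged decision chain described above can be pictured as a simple sequential pipeline in which each level adds latency and either keeps or rejects the event. The Python sketch below is only an illustration of that logic: the latencies are taken from the text, but the decision functions and the event fields are invented placeholders:

from dataclasses import dataclass
from typing import Callable

@dataclass
class TriggerLevel:
    name: str
    latency_us: float                  # time at which the decision is available
    accept: Callable[[dict], bool]     # decision on the partial event data

# Latencies from the text; the selection criteria are hypothetical placeholders.
levels = [
    TriggerLevel("level-zero", 1.2,   lambda ev: ev["fmd_centrality"] > 0.1),
    TriggerLevel("level-one",  2.0,   lambda ev: ev["zdc_ok"] or ev["dimuon"]),
    TriggerLevel("level-two",  100.0, lambda ev: ev["full_event_ok"]),
]

def run_trigger(event: dict) -> bool:
    """Pass the event through each level in turn; stop at the first rejection."""
    for level in levels:
        if not level.accept(event):
            print(f"rejected at {level.name} (t = {level.latency_us} us)")
            return False
    print("accepted for readout after ~100 us")
    return True

run_trigger({"fmd_centrality": 0.4, "zdc_ok": True,
             "dimuon": False, "full_event_ok": True})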

The architecture of the ALICE data acquisition system is determined by the relatively short heavy-ion runs foreseen for the LHC, roughly 10% of each year’s running. The collaboration will have ten times as long to analyse the data as they have to collect them, and so a high bandwidth system is envisaged in order to collect as much data as possible in the time available. CPU-intensive operations such as event filtering and reconstruction will be performed offline. Data acquisition is the responsibility of Budapest, CERN, and Oslo.

• March 1996 pp9–12 (abridged).

 

Green light for ALICE

ALICE has received the green light to proceed towards final design and construction. ALICE is the natural continuation, at CERN, of the SPS heavy-ion programme, initiated in 1986, which has recently provided exciting new results in the quest for the quark-gluon plasma.

Up to 50,000 charged particles are expected to be emitted in a lead–lead collision at the LHC, of which about 10,000 will pass through the ALICE central detector. That is why central tracking in ALICE is based on the Time Projection Chamber (TPC) technique, which has already proven its value for registering tracks in a high-multiplicity environment in the NA49 experiment at the SPS. The LHC collision rate in heavy-ion mode is compatible with TPC drift times of around 100 microseconds.
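
That compatibility can be made quantitative: the mean number of extra collisions overlapping a triggered event is simply the interaction rate multiplied by the drift window. In the Python sketch below the drift time is from the text, while the Pb–Pb interaction rate is an assumed order-of-magnitude figure, not one quoted in the article:

# Mean pile-up in the TPC = interaction rate * drift time.
drift_time_s = 100e-6            # ~100 microseconds, from the text
interaction_rate_hz = 8e3        # assumed Pb-Pb interaction rate, order 10 kHz
mean_pileup = interaction_rate_hz * drift_time_s
print(f"mean overlapping collisions per drift window: {mean_pileup:.1f}")  # ~0.8
# Of order one extra event per drift window, which TPC pattern recognition can
# tolerate; a rate several orders of magnitude higher would pile up too many
# tracks in the drift volume.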

In the forward direction, within a 9-degree cone around the beam, ALICE will be equipped with a muon spectrometer, made up of a sophisticated hadron absorber, a dipole magnet, five tracking stations (made of cathode pad/strip chambers) and two trigger stations (made of resistive-plate chambers). Measurements of muon pairs are an essential part of the ALICE physics programme, since heavy dileptons probe the early stages of the produced medium.

• April 1997 pp4–5 (extract).

CMS: Building on innovation

From the beginning, the CMS collaboration took a new approach, planning to assemble the detector above ground in a spacious surface building while the civil-engineering work on the underground cavern was underway. Alain Hervé, who had been Technical Coordinator for the L3 experiment at LEP before taking up the same position with CMS, strongly recommended constructing the detector in slices that would be lowered down the 100 m shaft into the cavern after extensive commissioning on the surface. This had never been done before for such a large-scale high-energy physics experiment, most experiments being constructed directly in the experimental area. This decision, together with the requirement of ease of maintenance, determined the overall structure of the detector, with slices that could be lowered one by one – 15 heavy pieces in all.

“It is very unusual to do this, but the surface building was made quite large, and we could work on several pieces at the same time because they could easily be moved back and forth. Also the underground civil-engineering work in the caverns would take time, so we started assembling the detector four to five years before the underground cavern was finished. The fully tested elements were lowered underground between November 2006 and January 2008. The experiment is commissioned and now ready for data-taking. The duration of the lowering operation and commissioning was essentially that foreseen 17 years ago,” explains Jim Virdee, who has been with CMS since the very beginning and spokesperson since 2007. “I know a few future experiments are looking at this way of doing things,” he adds, “so I think it might catch on. It gives a lot of flexibility, providing ease of maintenance and installation. Even late on we could work on various elements in parallel in the underground cavern.”


The long process from the design phase to final construction encompassed some crucial changes in technology, which allowed savings in time, money and effort. Despite the unexpected challenges that arose, the collaboration remained flexible and creative in solving them. “We needed radiation-hard electronics in our tracker, electromagnetic calorimeter and hadron calorimeters, along with radiation-tolerant muon systems. We did a lot of R&D on this with industries that had produced radiation-hard electronics, usually for space or military applications,” recalls Virdee. The collaboration was ready to launch production of the front-end electronics of the inner tracker when the foundry that was going to produce the electronics moved, and somehow lost its ability to produce electronics with good radiation hardness. “So we were thrown back to the drawing board and had to develop a new way of obtaining radiation-hard electronics,” says Virdee. “We essentially changed all of our on-detector electronics for the tracker and the electromagnetic calorimeter. This was a major issue that we were confronted with in the late 1990s and it’s all worked out very well. A lot of people thought we had left it too late, and I was being advised that we were taking a risk, but it was a risk we had to take.”

Another significant challenge concerned the production of 75,000 lead-tungstate crystals in Russia and China. These were chosen for their compactness, owing to their short radiation length, and for their high radiation hardness. However, early tests revealed problems when using silicon photodiodes: the scintillation light was drowned out by unwanted signals from charged particles at the end of the shower passing through the photodiodes. A solution was found in silicon avalanche photodiodes, which could also work in a magnetic field. Working with the crystal supplier in Russia also proved interesting. “The economic conditions in Russia have changed a lot since we started producing the crystals,” says Virdee, “so much so that we had to place the last few orders in roubles, no longer in dollars, because the manufacturer considered the rouble to be the more stable and stronger currency!”

In 1999 the CMS collaboration made a major decision to change the design of its inner tracker. Originally, after much R&D on various technological options, the design included both microstrip gas chambers (MSGCs) and silicon sensors. The cost per square centimetre of silicon detectors in the early 1990s was high, so the plan was to use silicon detectors close to the interaction point and MSGCs further away. “This technology required some development to make it suitable for use at the LHC, and essentially we succeeded in doing that,” says Virdee. However, development of silicon detectors continued during the decade: larger wafers were becoming available at a competitive cost and with improved performance. Furthermore, automation – as employed in the electronics industry – allowed rapid and reliable production of the 17,000 silicon modules needed for the tracker.


At the beginning of 1999, when it was clear that silicon had become competitive with the MSGCs, the collaboration took the bold decision, based on practical considerations, to use only silicon. “We were pressed for time, and having two different technologies required us to have two different systems doing similar work. At the time we had not invested as much effort in the systems issues as we would have wished,” Virdee explains. “So one of the key issues that arose was: can we come up with a single design to simplify the work and save time? The basic point was that the silicon detectors were of high quality and were mass-produced by industry, so we could simply buy them, whereas high-rate production lines for MSGCs had still to be commissioned.”

Once the LHC starts, the CMS physicists, some of whom have spent most of their working lives building the large and complex subdetectors, will have the long-awaited chance for discoveries. “However, before we do that we need to verify that the subdetectors perform as designed. Currently, we are doing that by running with cosmic rays. As far as we can tell the detector is working as expected and this is very encouraging. The moment of truth, however, will be when we record collision data,” says Virdee. “This start-up is very exciting because we are making a big leap up in energy and entering a new regime. All indications are that there is something special about this energy range.”
