Topics

LHC restart impresses Council

At its 153rd session on 18 December, the CERN Council heard that the LHC had ended its first full period of operation two days earlier, following collisions at a total energy of 2.36 TeV – a world record. The LHC circulated its first beams of 2009 on 20 November, ushering in a remarkably rapid beam-commissioning phase (The LHC is back: a remarkable four weeks). The first collisions were recorded on 23 November, and the world-record beam energy was established on 30 November. Following these milestones, a systematic phase of commissioning led to a period in which the six LHC experiments recorded more than a million collision events, which were distributed for analysis around the world on the LHC Computing Grid.

At the end of this first period of running, the LHC went into standby mode for a short technical stop to allow preparations for higher-energy running after a restart scheduled for February. In November teams had commissioned and tested the magnet powering up to 2 kA, which corresponds to a beam energy of 1.18 TeV. Running at higher energies requires higher currents, placing more exacting demands on the new machine-protection systems, which need to be readied for the task. Commissioning work for higher energies has been under way throughout January, together with adaptations to the hardware and software of the protection systems that the 2009 run showed to be necessary.

“Council is extremely pleased and impressed by the way the LHC, the experiments and the Computing Grid have operated this year,” said Council president Torsten Åkesson. “The laboratory set itself an ambitious but realistic programme at its February [2009] planning meeting. The fact that all the objectives set back then have been achieved is a ringing endorsement of the step-by-step approach adopted by CERN management.”

Other Council business included the question of geographic enlargement of CERN. Council heard from a working group established in 2008 to examine this question, and accepted a series of guiding principles concerning such an enlargement, with a possible associate status involving balanced benefits and obligations being developed. In parallel, CERN has received five applications for membership over the past 12 months. Council decided to establish a working group to undertake the tasks of technical verification and fact-finding relating to these applications.

At the end of the meeting, Åkesson handed over the Council’s presidency to Michel Spiro, director of the National Institute of Nuclear and Particle Physics (IN2P3) of the National Centre for Scientific Research (CNRS) in France. “I am greatly honoured to have been elected president of the CERN Council,” said Spiro. “I will be the Council’s 20th president, and it is with humility that I take up the mantle of my illustrious predecessors, not least Professor Åkesson, who has made significant progress with the organization over the term of his mandate. With the first results from the LHC eagerly anticipated, the period ahead promises to be a golden era: it is these results that will shape the future of particle physics and of CERN.”

Protons are back in the LHC

During the last weekend of October, particles once again entered the LHC after the one-year interruption following the incident of September 2008, travelling through one sector in each direction – clockwise and anticlockwise. ALICE and LHCb, the two experiments sitting along the portion of the beam lines in question, were able to observe the effects of beams in the machine. A week later, at around 8 p.m. on 7 November, protons travelling anticlockwise arrived at the doorstep of the CMS experiment, thus completing half of the journey around the LHC.

On 23 October, a first beam of ions entered the clockwise beam pipe of the LHC. Previous tests, on 25–26 September, had involved injecting lead-ion beams through the whole injection chain right up to the threshold of the LHC. This time, the lead ions entered the LHC just before point 2, where the ALICE experiment is installed, and were dumped before point 3. These tests enabled the machine experts to test the operation of the whole injection chain and an entire sector (sector 1-2) of the LHC.

Several sub-detectors of the ALICE experiment were switched on and saw their first beam. This helped them synchronize with the LHC clock and test the capability of the sub-detectors to measure high particle multiplicities.

During the afternoon on the following day, the first proton beam made its way through the TI8 transfer line up to the anticlockwise beam pipe of the LHC. Protons passed through the LHCb experiment and were dumped just before point 7.

Most of the LHCb sub-detectors were switched off to keep the experiment safe during these delicate operations. Only the beam and background monitors remained switched on, allowing an opportunity for commissioning of the beam-monitoring software. A highlight of the weekend was the switching on of the LHCb magnet, with operators able to measure its effect on the LHC beam and adjust the magnetic compensators around LHCb accordingly to bring the beam back into orbit.

The first weekend of November saw protons complete their journey anticlockwise through three octants before being dumped in collimators just prior to entry to the cavern of the CMS experiment. The particles produced by the impact of the protons on the tertiary collimators (used to stop the beam) left their tracks in the calorimeters and the muon chambers of the experiment. The more delicate inner detectors remained switched off for protection reasons.

During the same weekend, bunches of protons were also sent in the clockwise direction, passing through the ALICE detector before being dumped at point 3.

Hardware commissioning and magnet-powering tests have also continued in the LHC. By the first week in November, six of the eight sectors had been commissioned up to 2 kA, sufficient to guide a beam at an energy of about 1.2 TeV. Furthermore, the qualification of the new quench-protection system is progressing well, with the measured values complying with the stringent standards.
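The quoted correspondence between powering current and beam energy follows from the fact that the dipole bending field, and hence the maximum beam energy, scales roughly linearly with the magnet current. A minimal sketch of that scaling, assuming the LHC's nominal design point (11.85 kA for 7 TeV per beam, values not stated in the text) as the reference:

```python
# Rough check of the current-to-energy scaling for the LHC dipoles.
# Assumes beam energy scales linearly with dipole current, anchored to
# the nominal design point (assumed values: 11.85 kA -> 7 TeV per beam).
NOMINAL_CURRENT_KA = 11.85   # design dipole current, kA (assumed)
NOMINAL_ENERGY_TEV = 7.0     # design beam energy, TeV (assumed)

def beam_energy_tev(current_ka):
    """Beam energy implied by a given dipole current, linear scaling."""
    return NOMINAL_ENERGY_TEV * current_ka / NOMINAL_CURRENT_KA

print(f"2 kA -> {beam_energy_tev(2.0):.2f} TeV per beam")
```

Under these assumptions 2 kA indeed comes out at about 1.2 TeV, consistent with the figure quoted above.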

• CERN publishes regular updates on the LHC in its internal Bulletin, available at www.cern.ch/bulletin, as well as on the main site www.cern.ch, on Twitter at www.twitter.com/cern and on YouTube at www.youtube.com/cern.

Science begins at SLAC’s new light source

The first experiments are now under way using the world’s most powerful X-ray laser, the Linac Coherent Light Source (LCLS), located at the SLAC National Accelerator Laboratory. With 10,000 million times the brightness of any other man-made X-ray source, the light from the LCLS can resolve detail the size of atoms, enabling the facility to break new ground in research in many fields including physics, structural biology, energy science and chemistry.

The LCLS takes short pulses of electrons accelerated in SLAC’s linac and directs them through a 100 m stretch of alternating magnets that force the electrons to slalom back and forth. This motion makes the electrons emit X-rays, which become synchronized as they interact with the electron pulses, thus creating the world’s brightest X-ray laser pulse. Each of these laser pulses has 10¹² X-ray photons in a bunch only 100 fs long.
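Those per-pulse figures imply a striking instantaneous power, which simple arithmetic makes concrete. A back-of-the-envelope sketch, assuming a photon energy of about 8 keV (a typical hard-X-ray value, not stated in the text):

```python
# Back-of-the-envelope peak power of an LCLS pulse: 1e12 photons in a
# 100 fs bunch, assuming ~8 keV per photon (an assumed typical value).
EV_TO_J = 1.602e-19      # joules per electron-volt
photons_per_pulse = 1e12
photon_energy_ev = 8e3   # assumed hard-X-ray photon energy
pulse_length_s = 100e-15

pulse_energy_j = photons_per_pulse * photon_energy_ev * EV_TO_J
peak_power_w = pulse_energy_j / pulse_length_s

print(f"pulse energy ~ {pulse_energy_j * 1e3:.1f} mJ")
print(f"peak power  ~ {peak_power_w / 1e9:.0f} GW")
```

With these assumptions each pulse carries on the order of a millijoule, delivered at a peak power of some tens of gigawatts.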

Commissioning assisted by users is currently under way, with experiments taking place using the Atomic, Molecular and Optical (AMO) science instrument, the first of six instruments planned for the LCLS. In these first experiments, the researchers are using X-rays from the LCLS to gain an in-depth understanding of how the ultrabright beam interacts with matter.

Early studies are revealing new insights into the fundamentals of atomic physics and have successfully proved the machine’s capabilities to control and manipulate the underlying properties of atoms and molecules. Researchers have used the pulses from the LCLS to strip neon atoms completely of all of their electrons. They have also watched for two-photon ionization. This is normally difficult to observe at X-ray facilities, but the extreme brightness of the laser beam at the LCLS makes the study of these events possible.

Future AMO experiments will create stop-action movies of molecules in motion. The quick, short, repetitive X-ray bursts from the LCLS enable experiments to form images as molecules move and interact. By stringing together many such images to make a movie, researchers will be able to watch the molecules of life in action, view chemical bonds forming and breaking in real time and see how materials work on the quantum level.

The LCLS is a testament to SLAC’s leadership in accelerator technology. Four decades ago, the laboratory’s 3 km-long linear accelerator began to reveal the inner structure of the proton. Now, this same machine has been revitalized for pioneering research at the LCLS. By 2013, all six LCLS scientific instruments will be on-line and operational, providing unprecedented tools for a range of research in material science, medicine, chemistry, energy science, physics, biology and environmental science.

Florida lab is to build high-field ‘supermagnet’

The National High Magnetic Field Laboratory at Florida State University has been awarded nearly $3 million to build a high-temperature superconducting magnet that will break records for magnetic field strength by aiming to reach 32 T. Around 8 km of cable formed from the high-temperature superconductor yttrium barium copper oxide, or YBCO, will go into the construction of the new magnet.

Superconducting magnets are well known in the world of particle accelerators (reaching a field of more than 8 T in the LHC, for example) and in magnetic-resonance imaging in hospitals (with fields of 1–3 T). They are also commonly used in high-field research, where one benefit is that they create more stable fields than do resistive magnets.

While superconducting magnets use a lot less electricity than their resistive counterparts, they traditionally operate at low temperatures that require costly cryogens. Magnets built from the high-temperature superconductor YBCO are not only cheaper to operate but can also function at magnetic fields above about 23 T, where low-temperature superconducting magnets cease to work.

The construction of the 32 T magnet is funded by a grant of $2 million from the National Science Foundation and $1 million from Florida State University. The aim is to develop and demonstrate technology that will allow superconducting magnets to replace the resistive magnets in the National High Magnetic Field Laboratory.

NDCX-II project commencing at LBNL

Construction is beginning on the second-generation Neutralized Drift Compression eXperiment (NDCX-II), a new high-current, modest-kinetic-energy accelerator at Lawrence Berkeley National Laboratory (LBNL). The machine’s ion beams will enable studies of the poorly understood “warm dense matter” regime of temperatures around 10,000 K and densities near solid (as in the cores of giant planets). NDCX-II will also allow exploration of important issues in inertial-fusion target physics.

These studies support the ultimate goal of using ion beams to heat deuterium/tritium fuel to ignition in a future inertial fusion power reactor (a role for which accelerators appear well suited). NDCX-II has received $11 million of funding from the American Recovery and Reinvestment Act. Construction began in July with completion of the initial 15-cell configuration anticipated in March 2012.

NDCX-II will accelerate a beam of 30–50 nC of Li+ ions to 1.5–4 MeV and compress it into a pulse around 1 ns long. The short, high-current pulse is important for applications requiring efficient stopping of ions for rapid heating of a small amount of matter. As with the existing NDCX-I, the new machine uses neutralized drift compression. In this process, the beam’s tail is given a higher velocity than its head, so that it shortens while it drifts in a plasma that provides electrons to cancel space–charge forces.
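The head–tail velocity tilt behind drift compression can be illustrated with one-dimensional kinematics: if the tail moves faster than the head, the bunch shortens linearly as it drifts, until the tail catches up. A toy sketch (all numbers are illustrative, not NDCX-II design parameters):

```python
# Toy 1-D illustration of drift compression: a velocity "tilt" makes
# the bunch tail travel faster than the head, so the bunch shortens as
# it drifts. All numbers below are illustrative, not design values.
def bunch_length(initial_length_m, head_v, tail_v, drift_time_s):
    """Bunch length after drifting: the tail closes on the head at (tail_v - head_v)."""
    return initial_length_m - (tail_v - head_v) * drift_time_s

L0 = 0.5          # initial bunch length, m (illustrative)
v_head = 1.00e7   # head velocity, m/s (illustrative)
v_tail = 1.05e7   # tail velocity: 5% faster (illustrative tilt)

# Time for the tail to fully catch the head (maximum compression):
t_focus = L0 / (v_tail - v_head)
print(f"full compression after {t_focus * 1e6:.1f} us of drift")
print(f"halfway there, length = {bunch_length(L0, v_head, v_tail, t_focus / 2):.3f} m")
```

In the real machine the drift takes place in a plasma, whose electrons neutralize the space charge that would otherwise blow the compressing bunch apart.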

The figure shows the layout of the machine. It will make extensive use of induction cells (accelerating elements) and other parts from the decommissioned Advanced Test Accelerator (ATA) at Lawrence Livermore National Laboratory (LLNL). It will be extensible and reconfigurable. In the configuration that has received the most emphasis, each pulse will deliver Li+ ions at 3 MeV into a millimetre-diameter spot onto a thin-foil target. Pulse compression to around 1 ns begins in the accelerator and finishes in the drift compression line.

NDCX-II employs novel beam dynamics to achieve unprecedentedly rapid pulse compression in a short ion accelerator. The 200 kV charged transmission-line pulsed-power voltage sources from ATA, known as “Blumleins”, can provide voltage pulses no longer than 70 ns. These are shown as blue cylinders in the figure. For them to be usable, it is necessary to reduce the ion-bunch duration from its original 500 ns. This shortening is accomplished in an initial stage of non-neutral drift compression, downstream of the injector and the first few induction cells (note the spaces between induction cells at the left end of the figure). Long-pulse voltage generators are used at the front end; Blumleins power the rest of the acceleration.

Extensive particle-in-cell computer simulation studies have enabled an attractive physics design that meets the stringent cost goal. Snapshots from a simulation video are shown in the figure. Studies on a dedicated test stand are examining the ATA hardware and supporting the development of new pulsed solenoids that will provide transverse beam confinement.

Applications of this facility will include studies of warm dense matter using uniform, volumetric ion-heating of thin foil targets, and studies of ion energy coupling into an expanding plasma (such as occurs in an inertial fusion target). NDCX-II will also enable a better understanding of space-charge-dominated ion-beam dynamics and of beam behaviour in plasmas. The machine will complement facilities at GSI in Darmstadt, but will employ lower ion kinetic energies and commensurately shorter stopping ranges in matter.

NDCX-II will contribute to the long-term goal of electric power production via heavy-ion inertial fusion. In inertial fusion, a target containing fusion fuel is heated by energetic driver beams and undergoes a miniature thermonuclear explosion. The largest inertial confinement facility is Livermore’s National Ignition Facility (NIF). NIF is expected to establish the fundamental feasibility of fusion ignition on the laboratory scale. Heavy-ion accelerators offer efficient conversion of input power into beam energy, are long-lived, and can use magnetic fields for final focusing onto a target. These attributes make them attractive candidates for a power plant. The beams in such a system will require manipulations similar to those being pioneered on NDCX-II.

• NDCX-II is sponsored by the US Department of Energy’s Office of Fusion Energy Sciences. It is being developed by a collaboration known as the Virtual National Laboratory for Heavy Ion Fusion Science, including LBNL, LLNL and the Princeton Plasma Physics Laboratory.

Heidelberg Ion Therapy Centre opens

The Heidelberg Ion Therapy Centre (HIT) celebrated its opening at the Heidelberg University Hospital on 2 November. Developed with scientists and engineers at GSI in Darmstadt, the novel ion-beam cancer therapy facility is now ready to treat large numbers of patients, some 1300 a year.

HIT uses beams of ions, i.e. positively charged carbon or hydrogen atoms, which penetrate the body and exert their full impact deep within the tissue. To reach the tumour tissue, the ion beams are accelerated and then steered with such precision that they can irradiate a tumour the size of a tennis ball with millimetre accuracy, point by point. The surrounding healthy tissue remains mostly unaffected, so the method is particularly suited for treating deep-seated tumours that are close to vital or important organs such as the brain stem or the optic nerve.

The new facility has grown out of pioneering work at GSI, which has conducted fundamental research in radiobiology, nuclear physics and accelerator technology for therapeutic uses since 1980. The construction of a pilot ion-therapy project at GSI began in 1993 in a collaboration between GSI, the Heidelberg University Hospital, the Deutsches Krebsforschungszentrum in Heidelberg and the Forschungszentrum Dresden-Rossendorf.

At the same time, plans were made to introduce ion-beam therapy as a regular component of patient care with a new clinical facility at Heidelberg. HIT thus represents a direct transfer of technology from the GSI pilot project, which introduced several innovative techniques. These included: the raster scan method, which allowed tailored tumour irradiation with a carbon-ion beam; an accelerator that permits rapid variation in the energy of the ion beam in order to adjust the penetration depth inside a tumour; a fast control system to steer the ion beam safely inside the patient at millisecond intervals; and monitoring of the irradiation through a positron emission tomography (PET) camera, to make sure the beam hits the tumour.
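The raster-scan principle described above, stepping the beam energy to select a depth slice and then steering the beam point by point across that slice, can be sketched as a simple nested loop. This is a schematic illustration only; the actual HIT control system and its parameters are far more sophisticated:

```python
# Schematic sketch of raster-scan ion-beam delivery: for each depth
# slice (selected by the beam energy), visit a grid of lateral spot
# positions. All names and numbers are illustrative, not HIT's.
def raster_scan(energy_slices_mev, grid_points_mm):
    """Yield (energy, x, y) spots covering a tumour volume slice by slice."""
    for energy in energy_slices_mev:    # deeper slices need higher energy
        for x in grid_points_mm:
            for y in grid_points_mm:
                yield energy, x, y      # one irradiation spot

energies = [150, 200, 250]              # MeV/u per depth slice (illustrative)
grid = [-10, 0, 10]                     # lateral offsets in mm (illustrative)
spots = list(raster_scan(energies, grid))
print(f"{len(spots)} spots scheduled")  # 3 slices x 3 x 3 lateral points
```

The point of the scheme is that dose is painted spot by spot in three dimensions, which is what allows millimetre accuracy over a tennis-ball-sized volume.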

Since 1997, 440 patients, most of them with tumours at the base of the skull, have been treated with carbon ion beams at the GSI facility. Clinical studies proved the success of the treatment, documenting a cure rate of up to 90%. Ion-beam treatment is now an accepted therapy, with health-insurance providers refunding the costs.

The new treatment centre is operated by the Heidelberg University Hospital, where a special building with a floor space of 60 m × 80 m was constructed to host it. The facility has a 5 m long linear accelerator and a synchrotron with a diameter of 20 m. Three treatment spaces are located adjacent to the accelerators, two of which are a development of technology used at GSI. The third treatment space features a gantry – a rotating ion-beam guidance system – that is a direct advance on the prototype developed at GSI. The gantry allows the ion beam to be aimed at a patient’s tumour at any angle, thus greatly enhancing the treatment options.

The ion-beam cancer treatment available at HIT is the first of its kind. Japan is currently the only other country offering ion-beam cancer therapy, but with a less effective irradiation technique. Under a licence agreement between GSI and Siemens AG, two more facilities modelled on HIT are under construction in Marburg and Kiel.

The continuing rise of micropattern detectors

The invention of micropattern gaseous detectors (MPGDs), in particular the gas electron multiplier (GEM) by Fabio Sauli and the micromesh gaseous structure (Micromegas) by Ioannis Giomataris, has triggered a range of active research and development on a new generation of gaseous detectors. These technologies, together with other new micropattern detector schemes that have arisen from these initial ideas, are now enabling the development of detectors with unprecedented spatial resolution and high-rate capability, which also have large sensitive areas and exhibit operational stability and increased radiation hardness. Many groups worldwide are developing MPGD devices for future experiments, not only at particle accelerators but also in nuclear and astroparticle physics, as well as for applications such as medical imaging, material science and security inspection.

This range of activity was the subject of the first international conference on MPGDs, which was organized at the Orthodox Academy of Crete, in Kolymbari, Greece, on 12–15 June 2009. The RD51 collaboration, which was established to advance the technological development and application of MPGDs, actively participated in the conference and held its collaboration meeting immediately afterwards on 16–17 June. The Orthodox Academy conference centre offered an ideal environment for the detailed examination of MPGD issues, together with the exchange of ideas and lively discussions that took place in both meetings. Crete is after all where, according to the myths of Daedalus and Talos, technology emerged during the Minoan civilization.

From COMPASS to the ILC

The history of MPGDs is much shorter, but nevertheless it is already rich in results and prospects. In 1999 COMPASS at CERN became the first high-energy physics experiment to use large-area Micromegas and GEM detectors in high-rate hadron beams. Micromegas produced with the new “microbulk” technology have backgrounds of a few 10⁻⁷ counts/s/keV/cm². They might allow big improvements in the research potential of experiments that are searching for rare events (such as CAST, MIMAC and NEXT). Three time-projection chambers (TPCs) developed for the Tokai to Kamioka (T2K) project are using large pixellized Micromegas made using bulk technology to read out data from some 80,000 channels. This promising neutrino-oscillation experiment reported impressive technological progress and results. Meanwhile, GEMs are about to be used in the TOTEM experiment at the LHC.

Review talks on future accelerators and upgrades, in particular the sLHC and the International Linear Collider (ILC) projects, covered the physics potential and set the requirements for detectors. MPGDs are in a favourable position thanks to their excellent properties. Research and development has already begun on a pixellized tracker (namely GridPix, or the Gas On Slimmed Silicon Pixels [GOSSIP] detector) for the upgrades of the LHC experiments, aiming for a spatial precision of around 20 μm. MPGDs are also good candidates for upgrading end-cap muon detection (with a precision of around 25 μm). Detectors with large surface areas pose a serious problem, however, owing to the huge number of read-out channels. A modified MPGD with controlled charge dispersion on a resistive anode-film laminated above the read-out plane would allow wide pads (about 2.7 mm), thus reducing significantly the number of channels.
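The channel-count argument above is simple arithmetic: for a fixed read-out area, the number of channels falls with the square of the pad pitch, so widening pads to about 2.7 mm cuts the count dramatically. A sketch with illustrative numbers (the 0.5 mm fine pitch and the 1 m² area are assumptions for comparison, not figures from the text):

```python
# Channel count versus pad pitch for a fixed read-out area:
# channels scale as area / pitch^2. The area and the fine pitch
# used for comparison are illustrative assumptions.
def n_channels(area_m2, pad_pitch_mm):
    """Approximate number of square pads of the given pitch covering the area."""
    pitch_m = pad_pitch_mm * 1e-3
    return area_m2 / pitch_m**2

area = 1.0                       # m^2 of read-out plane (illustrative)
fine = n_channels(area, 0.5)     # illustrative sub-millimetre pitch
wide = n_channels(area, 2.7)     # the pad width quoted in the text

print(f"0.5 mm pads: {fine:,.0f} channels")
print(f"2.7 mm pads: {wide:,.0f} channels ({fine / wide:.0f}x fewer)")
```

The resistive anode's charge dispersion is what preserves good position resolution even with such wide pads.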

GEMs and variations of Micromegas are being designed for digital hadron calorimetry and for TPCs and their read-out electronics at the ILC. The spatial resolution, which is not affected by a magnetic field, is reaching a record 50 μm for this application. The ion feedback suppression offered by the MPGDs is particularly important for operation at high rates. The new development of an integrated Micromegas (INGRID) on top of silicon micropixel anodes offers a novel and challenging read-out solution, and is under study both for a TPC at the ILC and for a vertex detector for ATLAS. Recent results using a triple-GEM structure combined with either Medipix or Timepix read-out electronics were also presented at the conference.

Multiple applications

Moving away from applications in particle physics, the strip-resistive-electrode thick GEM (S-RETGEM) could be used as a flame/smoke detector for the detection of forest fires at distances of up to 1 km, compared with a range of about 200 m for commercially available UV-flame detectors. A detector structure inspired by the Micromegas concept, the Parallel Ionization Multiplier (PIM-MPGD), is being developed in collaboration with medical researchers for use in radio-pharmaceutical β-imaging, with a spatial resolution of 30 μm.

X-ray polarimetry for astrophysical applications now has a powerful tool, with intense development work on GEMs and thick GEMs (THGEMs) based on the pure noble gases xenon, argon, and neon. Interesting developments on GEMs and micropixel (μPIC) detectors operating as large-area VUV gas photomultiplier tubes were also presented at the conference. THGEMs are being assessed for applications in ring-imaging Cherenkov detectors and are also being used in a novel nuclear-imaging technique (3γ imaging) for medical purposes.

The construction of MPGDs is now moving away from planar geometry, but not without difficulties. Cylindrical Micromegas, as used in the CLAS12 detector at Jefferson Lab, and the triple-GEM structure developed for the KLOE experiment at Frascati, do not lose their performance compared with planar ones. Spherical GEMs are also being tested to fight parallax effects that pose limitations in many applications.

Rui De Oliveira of CERN presented the excellent research, development and innovation taking place at CERN in close collaboration with the GEM and Micromegas groups. He presented new photolithography and etching techniques that aim to improve several aspects of the performance of MPGDs, e.g. in robustness, homogeneity, sparking and electronics integration. MPGDs are now being manufactured with areas larger than around 0.5 m2, but further developments are needed for detectors for the sLHC and ILC. Industry has quickly become involved in MPGDs; several companies in Europe, Japan and the US are already manufacturing MPGD elements.

The conference proved the ideal occasion for discussions about the common aspects of all of the variations of MPGDs: field mapping, simulations, gases, electronics etc. All of the groups involved, and the two communities, GEM and Micromegas, came together in a fruitful collaboration. In addition, they were able to sample some of the beauty of Crete, present and past, with two special lectures, one on the history of Crete and the city of Chania, and one on Cretan flora. Participants also enjoyed walking excursions in the gorge of Samaria or visiting the archaeological site of Knossos. The conference dinner featured local delicacies, traditional Greek and Cretan music as well as dancing.

Looking beyond the LHC

The LHC at CERN is about to start the direct exploration of physics at the tera-electron-volt energy scale. Early ground-breaking discoveries may be possible, with profound implications for our understanding of the fundamental forces and constituents of the universe, and for the future of the field of particle physics as a whole. These first results at the LHC will set the agenda for further possible colliders, which will be needed to study physics at the tera-electron-volt scale in closer detail.

Once the first inverse femtobarns of experimental data from the LHC have been analysed, the worldwide particle-physics community will need to converge on a strategy for shaping the field over the years to come. Given that the size and complexity of possible accelerator experiments will require a long construction time, the decision of when and how to go ahead with a future major facility needs to be undertaken in a timely fashion. Several projects for future colliders are currently being developed and soon it may be necessary to set priorities between these options, informed by whatever the LHC reveals at the tera-electron-volt scale.

The CERN Theory Institute “From the LHC to a Future Collider” reviewed the physics goals, capabilities and possible results coming from the LHC and studied how these relate to possible future collider programmes. Participants discussed recent physics developments and the near-term capabilities of the Tevatron, the LHC and other experiments, as well as the most effective ways to prepare for providing scientific input to plans for the future direction of the field. To achieve these goals, the programme of the institute centred on a number of questions. What have we learnt from data collected up to this point? What may we expect to know about the emerging new physics during the initial phase of LHC operation? What do we need to know from the LHC to plan future accelerators? What scientific strategies will be needed to advance from the planned LHC running to a future collider facility? To answer the last two questions, the participants looked at what to expect from the LHC with a specific early luminosity, namely 10 fb⁻¹, for different scenarios for physics at the tera-electron-volt scale and investigated which strategy for future colliders would be appropriate in each of these scenarios. Figure 1 looks further ahead and indicates a possible luminosity profile for the LHC and its sensitivity to new physics scenarios to come.
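The 10 fb⁻¹ benchmark translates into event counts through the basic relation N = σ × ∫L dt. A sketch of the arithmetic, using an assumed cross-section of 50 pb (roughly the order of Higgs production at 14 TeV; an illustrative value, not one from the text):

```python
# Event yield from integrated luminosity: N = cross-section x int(L dt).
# The 50 pb cross-section is an assumed, illustrative value.
PB_INV_PER_FB_INV = 1e3   # 1 fb^-1 = 1000 pb^-1

def expected_events(cross_section_pb, int_lumi_fb_inv):
    """Number of events produced for a process of the given cross-section."""
    return cross_section_pb * int_lumi_fb_inv * PB_INV_PER_FB_INV

n = expected_events(50.0, 10.0)   # assumed 50 pb process at 10 fb^-1
print(f"~{n:.0e} events produced")
```

Only a fraction of the events produced survive triggering and selection cuts, which is why discovery prospects depend so strongly on the decay channel as well as on raw luminosity.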

Present and future

The institute’s efforts were organized into four broad working groups on signatures that might appear in the early LHC data. Their key considerations were the scientific benefits of various upgrades of the LHC compared with the feasibility and timing of other possible future colliders. Hence, the programme also included a series of presentations on present and future projects, with one talk on each possible accelerator followed by a talk on its strong physics points. These included the Tevatron at Fermilab, the (s)LHC, the International Linear Collider (ILC), the LHeC, the Compact Linear Collider (CLIC) concept and a muon collider.

Working Group 1, which was charged with studying scenarios for the production of a Higgs boson, assessed the implications of the detection of a state with properties that are compatible with a Higgs boson, whether Standard Model (SM)-like or not. If nature has chosen an SM-like Higgs, then ATLAS and CMS are well placed to discover it with 10 fb⁻¹ (assuming √s = 14 TeV, otherwise more luminosity may be needed) and measure its mass. However, measuring other characteristics (such as decay width, spin, CP properties, branching ratios, couplings) with an accuracy better than 20–30% would require another facility.

The ILC would provide e⁺e⁻ collisions with an energy of √s = 500 GeV (with an upgrade path to √s = 1 TeV). It would allow precise measurements of all of the quantum numbers and many couplings of the Higgs boson, in addition to precise determinations of its mass and width – thereby giving an almost complete profile of the particle. CLIC would allow e⁺e⁻ collisions at higher energies, with √s = 1–3 TeV, and if the Higgs boson is relatively light it could give access to more of the rare decay modes. CLIC could also measure the Higgs self-couplings over a large range of the Higgs mass and study directly any resonance up to 2.5 TeV in mass in WW scattering.

Working Group 2 considered scenarios in which the first 10 fb⁻¹ of LHC data fail to reveal a state with properties that are compatible with a Higgs boson. It reviewed complementary physics scenarios such as gauge boson self-couplings, longitudinal vector-boson scattering, exotic Higgs scenarios and scenarios with invisible Higgs decays. Two generic scenarios need to be considered in this context: those in which a Higgs exists but is difficult to see and those in which no Higgs exists at all. With higher LHC luminosity – for instance with the sLHC, an upgrade that gives 10 times more luminosity – it should be possible in many scenarios to determine whether or not a Higgs boson exists by improving the sensitivity to the production and decays of Higgs-like particles or vector resonances, for example, or by measuring WW scattering. The ILC would enable precision measurements of even the most difficult-to-see Higgs bosons, as would CLIC. The latter would also be good for producing heavy resonances.

Working Group 3 reviewed missing-energy signatures at the LHC, using supersymmetry as a representative model. The signals studied included events with leptons and jets, with a view to measuring the masses, spins and quantum numbers of any new particles produced. Studies of the LHC capabilities at √s = 14 TeV show that with 1 fb⁻¹ of LHC luminosity, signals of missing energy with one or more additional leptons would give sensitivity to a large range of supersymmetric mass scales. In all of the missing-energy scenarios studied, early LHC data would provide important input for the technical and theoretical requirements for future linear-collider physics. These include the detector capabilities where, for example, the resolution of mass degeneracies could require exceptionally good energy resolution for jets, running scenarios, required threshold scans and upgrade options – for a γγ collider, for instance, and/or an e⁺e⁻ collider operating in “GigaZ” mode at the Z mass. The link with dark matter was also explored in this group.


Working Group 4 studied examples of phenomena that do not involve a missing-energy signature, such as the production of a new Z’ boson, other leptonic resonances, the impact of new physics on observables in the flavour sector, gravity signatures at the tera-electron-volt scale and other exotic signatures of new physics. The sLHC luminosity upgrade has the capability to provide additional crucial information on new physics discovered during early LHC running, as well as to increase the search sensitivity. On the other hand, a future linear collider – with its clean environment, known initial state and polarized beams – is unparalleled in its ability to make ultraprecise measurements of new and SM phenomena, provided that the new-physics scale is within reach of the machine. For example, in the case of a Z’, high-precision measurements at a future linear collider would provide a mass reach that is more than 10 times higher than the centre-of-mass energy of the linear collider itself. Attention was also given to the possibility of colliding a high-energy electron beam with the LHC proton beam to provide an electron–proton collider, the LHeC. Certain phenomena such as the properties of leptoquarks could be studied particularly well with such a collider; for other scenarios, such as new heavy gauge-boson scattering, the LHeC can contribute crucial information on the couplings, which are not accessible with the LHC alone.

The physics capabilities of the sLHC, the ILC and CLIC are relatively well understood but will need refinement in the light of initial LHC running. In cases where the exploration of new physics might be challenging at the early LHC, synergy with a linear collider could be beneficial. In particular, a staged approach to linear-collider energies could prove promising.

The purpose of this CERN Theory Institute was to provide the particle-physics community with some tools for setting priorities among the future options at the appropriate time. Novel results from the early LHC data will open exciting prospects for particle physics, to be continued by a new major facility. In order to seize this opportunity, the particle-physics community will need to unite behind convincing and scientifically solid motivations for such a facility. The institute provided a framework for discussions now, before the actual LHC results start to come in, on how this could be achieved. In this context, the workshop report was also presented and made available to the European Strategy Session of the CERN Council meetings in September 2009. We now look forward to the first multi-tera-electron-volt collisions in the LHC, as well as to the harvest of new physics that these results will provide.

• For more about the institute, see http://indico.cern.ch/conferenceDisplay.py?confId=40437. The institute summary is available at http://arxiv.org/abs/0909.3240.

US industry-built ILC cavity reaches 41 MV/m


For the first time, a US-industry-made superconducting radiofrequency (SRF) cavity has reached and exceeded the accelerating gradient required for the envisioned International Linear Collider (ILC). The cavity achieved 41 MV/m at the ILC’s superconducting operating temperature of 2 K, far exceeding the ILC Global Design Effort (GDE) specification of 35 MV/m. The ILC would require about 16,000 such cavities.
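As a rough illustration of why so many cavities are needed, the arithmetic can be sketched as follows, assuming a nine-cell cavity active length of about 1.04 m and a nominal operating gradient of 31.5 MV/m (neither figure is given in the article):

```python
# Back-of-envelope check of the ILC cavity count.
# All numbers below except n_cavities are assumptions, not from the article.
n_cavities = 16_000          # total for both linacs (from the article)
active_length_m = 1.04       # nine-cell TESLA-style cavity (assumption)
gradient_v_per_m = 31.5e6    # nominal operating gradient (assumption)

# Each beam is accelerated by the cavities of one linac, i.e. half the total.
energy_per_beam_ev = (n_cavities / 2) * active_length_m * gradient_v_per_m

print(f"~{energy_per_beam_ev / 1e9:.0f} GeV per beam")  # prints ~262 GeV per beam
```

With these assumptions the 16,000 cavities, split between two linacs, deliver roughly 250 GeV per beam – that is, a 500 GeV centre-of-mass collider.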


Advanced Energy Systems Inc (AES) in Medford, New York, built the hollow niobium accelerating structure. A team at Jefferson Lab processed it by electropolishing and then tested it as part of R&D funded by the US Department of Energy. In addition, the team tested seven more AES cavities, one of which reached 34 MV/m, close to the specification. Several other North American companies are also attempting to manufacture ILC test cavities.

Jefferson Lab’s Rongli Geng, leader of the GDE Cavity Group, characterizes the 41 MV/m result as “remarkable”. He believes that it may be attributable to improvements in cavity treatment specific to AES cavities, which are aimed at optimizing the properties of the materials. Such optimization provides opportunities to attack the performance limitations of SRF cavities and improve the production yield in a realm other than processing and fabrication.

One such opportunity may have appeared during Jefferson Lab’s testing of AES cavities in conjunction with the heat treatment that removes hydrogen from cavity surfaces. Both the successful cavity and the one that was nearly successful underwent quicker, hotter heat treatment than had previously been standard: 2 hours at 800 °C instead of 10 hours at 600 °C. Because the AES-built cavities appeared to be stiffer, the revised treatment temperature primarily targeted the optimization of mechanical properties. However, because other improvements in material properties might also have occurred, the team at Jefferson Lab is conducting further investigations.

New temperature-mapping and optical-inspection tools adopted about a year ago under the guidance of ILC GDE project managers may also help to overcome the performance limitations of SRF cavities and improve the mass-production yield. “T-mapping” of cavity outer surfaces involves strategically placing thermal sensors to provide vital information about excessive heating in defective regions up to the point of local breakdown of superconductivity that causes a cavity to quench. This diagnostic procedure works in conjunction with the optical inspection of the surfaces within a cavity, which involves a mirror and a long-distance (around 1 m) microscope that together afford detailed mirror-reflected views of defective regions magnified at scales of about 0.1–1 mm.

First ions for ALICE and rings for LHCb


Injection tests on 25–29 September delivered heavy ions for the first time to the threshold of the LHC. Particles were extracted from the Super Proton Synchrotron (SPS) and transported along the TI2 and TI8 transfer lines towards the LHC, before being dumped on beam stoppers. These crucial tests not only showed that the whole injection chain performs well but were also of particular interest to the ALICE collaboration because they included bunches of lead ions. By using a dedicated “beam injection” trigger, the ALICE detector registered bursts of particles emerging from the beam stopper at the end of the TI2 transfer line, some 300 m upstream of the detector, shedding light on the timing of the trigger.


While the LHC has undergone repairs and consolidation work since the incident that brought commissioning to an abrupt end in September 2008, the ALICE collaboration has been busy with important installation work, which has included the first modules of the electromagnetic calorimeter. This allowed the start in August of a full detector run with cosmic rays, which was scheduled to last until the end of October. In addition to trigger information from the silicon pixel and ACORDE detectors (the latter built specially for triggering on cosmic muons), ALICE is now making extensive use of the trigger provided by its time-of-flight array (TOF). The high granularity and low noise (0.1 Hz/cm2) of the multigap resistive-plate chambers of the TOF, combined with the large coverage (around 150 m2), offer a range of trigger combinations.

More than 100 million cosmic events had been accumulated in the central detectors by early October, both with and without magnetic field. Even the forward muon system – oriented parallel to the LHC beam – has collected several tens of thousands of the very rare quasi-horizontal cosmic rays, which traverse the full length of the spectrometer at a rate of one particle every couple of minutes.

Near-horizontal cosmic rays are also valuable for checking out the LHCb detector, which is aligned along the LHC beam line, and they recently allowed observation of the first rings from one of the two ring-imaging Cherenkov detectors, RICH1. There are two types of radiating material in RICH1: aerogel for the lowest-momentum particles (a few GeV/c) and perfluoro-n-butane (C4F10) to cover momenta from 10 GeV/c to around 65 GeV/c. This is the first time that the RICH detector has seen a particle as it will once the LHC re-starts.
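The complementary momentum ranges of the two radiators follow from the Cherenkov threshold condition: a particle radiates only when β > 1/n, i.e. when p > m/√(n² − 1). A minimal sketch, assuming typical refractive indices for the two materials (these values are not quoted in the text):

```python
import math

# Cherenkov threshold: light is emitted only if beta > 1/n,
# which translates into a minimum momentum p > m / sqrt(n^2 - 1).
def threshold_momentum(mass_gev, n):
    """Minimum momentum (GeV/c) for Cherenkov radiation in a medium of index n."""
    return mass_gev / math.sqrt(n**2 - 1)

M_KAON = 0.4937                    # kaon mass in GeV/c^2
N_AEROGEL, N_C4F10 = 1.03, 1.0014  # assumed refractive indices

print(f"kaon threshold in aerogel: {threshold_momentum(M_KAON, N_AEROGEL):.1f} GeV/c")  # ~2.0
print(f"kaon threshold in C4F10:   {threshold_momentum(M_KAON, N_C4F10):.1f} GeV/c")    # ~9.3
```

With these assumed indices, kaons start to radiate in the aerogel at about 2 GeV/c but only above roughly 9 GeV/c in the C4F10, which is why a dense radiator covers the low-momentum range and a gas the high-momentum range.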

The shutdown of the LHC has also provided the opportunity for the LHCb collaboration to complete the detector, with the installation of the fifth and final plane of muon chambers. Other improvements include modifications to reduce noise in the electromagnetic calorimeter to a negligible level and network upgrades. During a recent commissioning week, in preparation for the LHC re-start, the LHCb team managed to read out the full detector at a rate of almost 1 MHz. Data packets were sent at 100 kHz through to the LHCb computer farm and each sub-detector was tested to ensure that the system could handle data at this rate.
