After some 180 days of running and 4 × 10¹⁴ proton–proton collisions, the LHC’s 2011 proton run came to an end at 5.15 p.m. on 30 October. For the second year running, the LHC team has comfortably surpassed its operational objectives, steadily increasing the rate at which the LHC has delivered data to the experiments.
At the beginning of the year’s run, the objective was to deliver an integrated luminosity of 1 fb–1 during the course of 2011. That goal was reached on 17 June, setting the experiments up well for the major physics conferences of the summer and prompting the 2011 data objective to be revised upwards to 5 fb–1. The new milestone was passed by 18 October; when proton running ended, the LHC had delivered around 5.6 fb–1 to both the ATLAS and CMS experiments, 1.2 fb–1 to LHCb and 5 pb–1 to ALICE. Physics highlights for these four big experiments include closing down the space available for the long-sought Higgs boson and supersymmetric particles to hide in, putting the Standard Model of particle physics through increasingly gruelling tests and advancing our understanding of the primordial universe.
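As a rough consistency check (the cross-section below is a textbook value, not a figure from this report), the number of collisions quoted above can be recovered from the delivered luminosity via N = σ_inel × L_int:

```python
# Rough cross-check, assuming an inelastic proton-proton cross-section
# of about 70 mb at 7 TeV (an assumed value, not quoted in the article).
sigma_inel_cm2 = 70e-3 * 1e-24   # 70 mb expressed in cm^2
lumi_cm2 = 5.6 * 1e39            # 5.6 fb^-1, since 1 fb^-1 = 1e39 cm^-2

n_collisions = sigma_inel_cm2 * lumi_cm2
print(f"~{n_collisions:.1e} inelastic collisions")  # ~3.9e+14, in line with
                                                    # the ~4 x 10^14 quoted above
```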
“At the end of this year’s proton running, the LHC is reaching cruising speed,” comments CERN’s director for accelerators and technology, Steve Myers. “To put things in context, the present data-production rate is a factor of 4 million higher than in the first run in 2010 and a factor of 30 higher than at the beginning of 2011.”
Time has also been devoted to some special physics runs for the smaller TOTEM and ALFA experiments, which probe small-angle (forward) scattering, allowing them to measure the total proton–proton cross-section and the absolute luminosity calibration. In these runs, the beam is de-squeezed to a β* of 90 m at ATLAS and CMS instead of the usual 1 m, giving a larger beam size at the interaction points and, correspondingly, a smaller beam divergence there.
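The connection between β*, beam size and divergence follows from σ* = √(εβ*) and σ′ = √(ε/β*). A minimal sketch, using an assumed emittance and beam energy (typical 2011 values, not figures from this article):

```python
# Illustrative only: rms beam size and divergence at the interaction point
# for two beta* settings, assuming a normalized emittance of 2.5 um rad
# and 3.5 TeV protons (assumed values).
import math

eps_n = 2.5e-6                  # normalized emittance [m rad] (assumed)
gamma = 3500.0 / 0.938272       # Lorentz factor for 3.5 TeV protons
eps_geo = eps_n / gamma         # geometric emittance [m rad]

for beta_star in (1.0, 90.0):   # beta* in metres
    size = math.sqrt(eps_geo * beta_star)        # rms beam size at the IP
    divergence = math.sqrt(eps_geo / beta_star)  # rms angular divergence
    print(f"beta* = {beta_star:5.1f} m: size = {size*1e6:6.1f} um, "
          f"divergence = {divergence*1e6:5.2f} urad")
```

De-squeezing to 90 m makes the beams roughly ten times larger but ten times more parallel at the crossing point, which is exactly what a measurement of very-small-angle scattering needs.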
A number of factors have contributed to these impressive totals, including: the increase in the total number of bunches to 1380 during the first part of the year, the high bunch intensity and small beam sizes delivered by the LHC injectors, and the good aperture in the regions around ATLAS and CMS, which has allowed a squeeze to β* = 1 m. About 25% of the programmed physics time was spent with stable beams – which is not bad at this stage in the LHC’s career, given its complexity and the operation with high-intensity beams.
Following the end of proton running, a week of machine development began. An early high point was the cohabitation of protons and lead ions in the LHC – low-intensity beams of protons (clockwise) and lead ions (anti-clockwise) were successfully injected and ramped together. A first test of proton–lead collisions was scheduled to follow after commissioning for the lead-ion run and a 5-day technical stop. If successful, these tests will lead to a new strand of LHC operation, using protons to probe the internal structure of the much more massive lead ions.
As in 2010, the main goal before the end of year, however, is a four-week period of lead-ion running before the machine closes down for the winter technical stop.
It has now been a full publishing year for the new-look design of CERN Courier. The dynamic layout, features and wide-ranging articles in 2011, as well as the lively covers, have all been well received by readers in the particle-physics community.
Now we would like your help to make CERN Courier magazine and the preview-courier.web.cern.ch website even better. This is your chance to tell us what you think and give us your suggestions and wish lists.
On the night of 11–12 October, just a few hours after installation of its camera, the First G-APD Cherenkov Telescope (FACT) recorded flashes of Cherenkov light from air showers induced by cosmic rays. Remarkably, the shower images were recorded during a full moon – a feat that would not have been possible with a conventional air Cherenkov telescope.
FACT, installed at an altitude of 2200 m at the Roque de los Muchachos Observatory on La Palma, in the Canary Islands, uses newly developed Geiger-mode avalanche photo-diodes (G-APDs) instead of the photomultiplier tubes (PMTs) normally used in Cherenkov telescopes. These first images, taken in ambient light 100 times brighter than PMT-based telescopes could tolerate, demonstrate for the first time the use of silicon detectors capable of recording images at a rate of 10⁹ a second.
The pioneering camera was designed and built by a collaboration from the universities of Dortmund, Geneva and Würzburg as well as EPF Lausanne (EPFL), led by ETH Zurich. It consists of 1440 G-APDs, each one a square with a side of only 3 mm. To increase the active area, the collaboration developed solid light concentrators together with the University of Zurich. Each concentrator has a hexagonal entrance window 9.5 mm across and a square exit window with a side of 2.8 mm to match a G-APD. The result is a roughly 10-fold concentration of the light collected from the telescope mirror, while background light arriving from outside the mirror area is rejected. One concentrator is glued to each G-APD, providing a field of view of 0.1° per pixel and 4.5° for the full camera.
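The quoted concentration factor follows directly from the geometry. A quick check, taking “9.5 mm across” to mean the width across the flats of the hexagon (an assumption):

```python
# Back-of-the-envelope check of the light-concentrator geometry.
import math

w = 9.5                                  # hexagon width across flats [mm] (assumed)
hex_area = (math.sqrt(3) / 2.0) * w**2   # area of a regular hexagon
square_area = 2.8**2                     # 2.8 mm square exit window

print(f"concentration ratio ~ {hex_area / square_area:.1f}")  # ~10
```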
The electronics to read each of the 1440 pixels individually is based on the DRS-4 analogue ring sampler chip operating at a frequency of 2 gigasamples/s. The complete electronics package is integrated into the camera body, and data are sent to the counting house via standard Ethernet. The complete camera weighs about 150 kg and has a power consumption of around 500 W.
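To get a feel for the scale of the data handled in the camera body, here is an illustrative estimate; the readout window, sample depth and trigger rate below are assumptions, not figures from the article:

```python
# Illustrative data-rate estimate for a 1440-pixel camera read out with
# waveform samplers (all three parameters below are assumed).
n_pixels = 1440
samples_per_pixel = 300      # assumed region of interest per event
bytes_per_sample = 2         # assumed 16-bit samples
trigger_rate_hz = 60         # assumed trigger rate

event_size = n_pixels * samples_per_pixel * bytes_per_sample   # bytes
rate_mb_s = event_size * trigger_rate_hz / 1e6
print(f"event size ~ {event_size/1e3:.0f} kB, data rate ~ {rate_mb_s:.0f} MB/s")
# roughly 0.9 MB per event and ~50 MB/s -- a load that standard Ethernet
# can carry, as described above
```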
The camera was assembled and tested at ETH Zurich, before being installed in the refurbished HEGRA CT3 telescope at the Roque de los Muchachos Observatory, next to the MAGIC telescopes. The telescope, which has a total mirror area of 9.5 m², was equipped with a new drive system and improved mirror facets.
The installation of the camera is the first step towards establishing a monitor telescope for variable gamma-ray sources. It has already begun to demonstrate that G-APDs are a viable alternative to PMTs in Cherenkov telescopes. Future developments with these devices promise even higher photon-detection efficiencies and lower costs than PMTs. Moreover, their low bias voltages of about 70 V make for stable and robust operation under the harsh conditions at Cherenkov telescope sites.
The LHCb experiment has had a remarkable year, moving from first results to world-beating measurements of B-hadron properties, such as the oscillation frequency of the Bs meson, CP and forward-backward asymmetries, as well as limits on rare decays, for example Bs → μ⁺μ⁻. Even though the physics harvest is now in full flow, the collaboration is already planning for the eventual upgrade of the experiment, which is scheduled to be ready for data-taking in 2019.
The instantaneous luminosity delivered to LHCb has steadily increased throughout the year, reaching 4 × 10³² cm⁻² s⁻¹ by the end of the run, already twice the original design luminosity for the experiment. Unlike the general-purpose detectors at the LHC, ATLAS and CMS, LHCb has been specifically designed for the optimal study of B hadrons, covering an angular range of 10–300 mrad from the beam axis (the forward region). This gives it different constraints concerning the luminosity. The track density increases in this region, so detectors suffer from higher occupancy, and are potentially more prone to radiation damage. In addition, because the experiment is tuned for the precise study of B-hadron decay vertices, too many overlapping events can confuse the picture. Finally, the experiment’s trigger has the special feature that it can select fully hadronic decays, rather than only relying on electron or muon signatures, and this trigger cannot handle too high an input rate without reducing its efficiency.
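A rough illustration of why overlapping events matter at this luminosity: the mean number of interactions per bunch crossing is μ = Lσ/(n_b f_rev). The bunch count and cross-section below are assumptions, not figures from the article:

```python
# Rough pile-up estimate at the 2011 LHCb luminosity (assumed inputs).
lumi = 4e32              # instantaneous luminosity [cm^-2 s^-1]
sigma_vis = 60e-27       # assumed visible pp cross-section, ~60 mb, in cm^2
n_bunches = 1300         # assumed number of colliding bunch pairs at LHCb
f_rev = 11245.0          # LHC revolution frequency [Hz]

mu = lumi * sigma_vis / (n_bunches * f_rev)
print(f"mu ~ {mu:.1f} interactions per crossing")   # ~1.6, so several
# overlapping collisions per crossing are already commonplace
```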
As a result, the luminosity cannot be pushed much higher in the current experiment. This has the positive aspect that next year should be one of continuous operation with the experiment in stable conditions, but eventually it means that the time taken to double the data-set will become long. The goal for this year was 1 fb–1 of integrated luminosity, which (thanks to the excellent performance of the LHC) was comfortably passed with a few weeks to spare; it represents more than 30 times as much data as last year. The expectation is to at least double that sample again in 2012, but for the longer term the collaboration plans to upgrade the experiment so that it can operate at higher luminosity and accumulate an order of magnitude more data. This will allow even higher precision in the search for new physics in the flavour sector.
The key to the upgrade will be to read out the full experiment at 40 MHz, the design bunch-crossing rate of the LHC, and to perform the trigger in software in a powerful computer farm. For this to succeed, collisions will indeed have to be provided by the LHC at 40 MHz at the time of the upgrade, rather than at the current rate of 20 MHz.
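The scale of that task can be sketched with one multiplication; the event size below is an assumption, not a number from the article:

```python
# Why a 40 MHz software trigger needs a powerful farm (event size assumed).
crossing_rate = 40e6        # bunch-crossing rate after the upgrade [Hz]
event_size_bytes = 100e3    # assumed average event size of ~100 kB

throughput_tb_s = crossing_rate * event_size_bytes / 1e12
print(f"readout throughput ~ {throughput_tb_s:.0f} TB/s")   # ~4 TB/s streamed
# into the farm, where software must whittle it down to a recordable rate
```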
The LHCb Collaboration submitted a Letter of Intent describing the proposed upgrade to the LHC Committee (LHCC) in March, and the committee endorsed the physics programme. A review panel looked into the proposed 40 MHz readout scheme and gave a positive report, so that the LHCC has now encouraged LHCb to proceed with preparing Technical Design Reports for the upgrade components. This will ensure the future of the experiment into the next decade.
Upgrades to the B-factory experiments are under consideration in Japan and Italy on the same timescale, and have a complementary reach for this physics. While they have strong performance for neutral decay products, they cannot compete with the enormous production rate at the LHC for charged modes, and the time-dependent study of Bs states will remain the province of LHCb. The upgrade will also allow the LHCb experiment to act as a general-purpose detector in the forward region, with the ability to search for exotic particles that might give long decay lengths, or to study in detail the influence of any new physics states that might be discovered at the LHC over the same period. The collaboration is now pressing ahead with the R&D necessary to ensure the upgrade’s success.
• For more information see Letter of Intent for the LHCb Upgrade, CERN-LHCC-2011-001.
The ALICE collaboration has measured the production of baryons containing two or three strange quarks in lead–lead collisions at the LHC, at an energy of 2.76 TeV per nucleon pair, nearly 14 times higher than that achieved previously at Brookhaven’s Relativistic Heavy Ion Collider (RHIC). The yields and transverse-momentum spectra of multi-strange baryons and antibaryons in heavy-ion collisions are important in characterizing the evolution of the hot matter created, as it passes from the strange quarks and antiquarks of the early partonic stages to the subsequent hadronization.
The collaboration identified multi-strange baryons mainly by a topological method, looking for their weak-decay products originating from secondary vertices well separated from the main interaction vertex. The researchers also exploited particle identification via specific energy loss in the time projection chamber (TPC). For example, the Ω⁻ baryon (consisting of three strange quarks) decays into a negative K meson and a neutral Λ baryon, which in turn decays into a proton and a negative π meson. A peak in the invariant mass spectrum of all (Λ, K⁻) combinations provides a clearly identifiable signal (figure 1).
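The invariant-mass spectrum is built by combining each Λ candidate with each K⁻ candidate in the event. A minimal sketch with made-up momenta, purely for illustration:

```python
# Invariant mass of a Lambda K- candidate pair; for genuine Omega- decays
# the distribution of this quantity peaks near 1672 MeV. The daughter
# momenta below are made-up numbers.
import math

M_LAMBDA, M_KAON = 1115.7, 493.7   # MeV

def four_vector(mass, px, py, pz):
    e = math.sqrt(mass**2 + px**2 + py**2 + pz**2)
    return (e, px, py, pz)

def invariant_mass(a, b):
    e = a[0] + b[0]
    p2 = sum((a[i] + b[i])**2 for i in (1, 2, 3))
    return math.sqrt(e**2 - p2)

lam = four_vector(M_LAMBDA, 900.0, 150.0, 300.0)   # hypothetical Lambda
kao = four_vector(M_KAON, 450.0, 60.0, 170.0)      # hypothetical K-
print(f"m(Lambda, K-) = {invariant_mass(lam, kao):.0f} MeV")
```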
Good momentum resolution and a precise secondary-vertex reconstruction were essential for this result. A key element was the excellent performance of the main tracking detectors in ALICE’s central barrel – the TPC and the internal tracking system (ITS) – in the challenging environment of central (head-on) lead–lead collisions.
The data were analysed in transverse momentum intervals up to 6–9 GeV/c for the doubly-strange baryons (Ξ⁻ and charge conjugate separately) and 6–8 GeV/c for the triply-strange baryons (Ω⁻ and charge conjugate separately), made possible by the examination of 30 million minimum-bias nuclear interaction candidate events. In addition, the centrality of the selected events was determined from signals collected in two scintillator hodoscopes at backward and forward rapidities (K Aamodt et al. 2011). This allowed the analysis to be repeated in four different centrality intervals, from the 0–20% most central (almost head-on) collisions to the 60–90% peripheral collisions, in order to compare with previous results at RHIC. Figure 2 shows the resulting transverse-momentum spectra, fully corrected for detector acceptance and efficiency; it also shows clearly how multi-strange baryon production increases with the centrality of the collision at LHC energy. The results were presented at the recent conference on Strangeness in Quark Matter (Strangeness and heavy flavours in Krakow).
Previous experiments at CERN’s Super Proton Synchrotron (SPS) and at RHIC obtained multi-strange baryon spectra and yields in 17 GeV lead–lead and 200 GeV gold–gold collisions, respectively. The ALICE experiment not only finds higher yields in lead–lead collisions at the LHC energy, but also finds that the enhancement with respect to proton–proton collisions is greater for the Ω than for the Ξ, confirming the trend observed at both SPS and RHIC. Moreover, the enhancement with respect to proton–proton data increases with the centrality of the collision, in a similar way to previous observations.
Supersymmetry (SUSY) remains one of the strongest candidates for physics beyond the Standard Model that could be detected in proton–proton collisions at the LHC. It could solve many of the outstanding issues in particle physics, such as the gauge hierarchy problem. SUSY would reveal itself through the production of new heavy particles, and it could also deliver a natural candidate particle to explain the large density of dark matter in the universe. However, it has so far evaded the searches of both the CMS and ATLAS experiments.
The figure shows a compilation of many of the most recent public results of CMS for integrated luminosities of about 1–2 fb–1. It illustrates the reach of the analyses with respect to pre-LHC experiments in the plane of the universal scalar and gaugino masses (m₀ and m₁/₂, respectively) at the grand unified theory scale of the constrained minimal supersymmetric extension of the Standard Model (CMSSM).
A large increase in sensitivity has clearly been obtained at the LHC with the data analysed, which were collected in 2010 and up to August 2011, but this parameter space is just one reference point among possible SUSY scenarios. Additional data will allow the exploration of other scenarios where each of the signatures, from no leptons to multileptons, may have the most sensitivity. The search channels shown range from final states with a few to many jets (αT, Jets+MHT, MT2), to jets plus one lepton (generally a muon or electron) and jets plus two leptons of either opposite (OS) or same (SS) charge. All of these channels also require large missing transverse energy. The latter is a key characteristic of many SUSY searches, reflecting the supposition that the lightest SUSY particle is expected to be neutral, stable and weakly interacting – thereby escaping detection.
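Missing transverse energy is simply the magnitude of the negative vector sum of the transverse momenta of everything the detector sees. A minimal sketch with toy values:

```python
# Missing transverse energy (MET) from a toy list of visible objects.
import math

# hypothetical (px, py) of reconstructed jets and leptons, in GeV
visible = [(120.0, 30.0), (-45.0, 80.0), (-20.0, -60.0)]

mpx = -sum(px for px, _ in visible)
mpy = -sum(py for _, py in visible)
met = math.hypot(mpx, mpy)
print(f"MET = {met:.1f} GeV")   # a large value hints at an invisible,
                                # weakly interacting particle escaping detection
```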
CMS recently released the results of SUSY searches for candidates containing at least three leptons. For these search channels, the Standard Model background is low, mostly di-boson events; this allows the missing transverse energy requirement to be relaxed considerably and so provide sensitivity to SUSY models with so-called R-parity violation. In these models, SUSY particles decay to Standard Model particles and no dark-matter candidate can escape detection. Moreover, such studies are sensitive to the channel of direct electroweak gaugino production, important in scenarios that conserve R-parity.
CMS analysed a total of 2.1 fb–1 of data. In general, no significant excess was observed in this new analysis – so SUSY, if it exists, still manages to hide away. As many as 52 different channels have now been looked into and, although a few of them show a slight excess over the background estimated from data, all of them currently have a significance of less than 2σ. CMS will certainly continue to “watch this space”.
At the time of this writing, more than 5 fb–1 of data have been recorded and are now being analysed. It promises to be an interesting winter for SUSY searches.
Searches for dilepton resonances have a history of discoveries, from the J/ψ and Υ to the Z boson. Now new neutral gauge bosons, Z’, which would appear as resonances, are predicted by a number of theories. They are the mediators of new forces that allow the unification of all fundamental forces at some very large energy scale. Dilepton resonances are also predicted as gravitons in models of extra-dimensional gravity.
The analysis by ATLAS used a data sample corresponding to an integrated luminosity of 1.1 fb–1. The sensitivity to new physics extends to 1.8 TeV, similar to that of recent preliminary results from CMS and far beyond the limits achieved at lower-energy accelerators. The observed dilepton mass distributions (for example, the di-electron distribution of figure 2) are in good agreement with the spectrum predicted by the Standard Model including higher-order QCD and electroweak corrections.
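Each entry in such a spectrum is the invariant mass of a lepton pair, which in the massless-lepton approximation depends only on the transverse momenta, pseudorapidities and azimuthal angles of the two leptons. A small illustration with toy kinematics:

```python
# Dilepton invariant mass in the massless-lepton approximation:
# m^2 ~ 2 pT1 pT2 (cosh(delta_eta) - cos(delta_phi)).
import math

def dilepton_mass(pt1, eta1, phi1, pt2, eta2, phi2):
    return math.sqrt(2.0 * pt1 * pt2 *
                     (math.cosh(eta1 - eta2) - math.cos(phi1 - phi2)))

# two hypothetical high-pT electrons (pT in GeV)
print(f"m_ee ~ {dilepton_mass(450.0, 0.6, 0.2, 430.0, -0.9, 3.0):.0f} GeV")
```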
The search technique employed by ATLAS involves the comparison of the dilepton mass distribution with the predicted spectrum over the entire high-mass range. The prediction includes a series of hypothetical resonance line-shapes with different masses and couplings. The dominant sources of systematic uncertainty are of a theoretical nature, arising from the calculations of the production rates.
The ATLAS detector will measure the mass of any resonance observed quite accurately. The liquid argon calorimeter provides a linear and stable response for electrons up to the highest energy, and the combination of the inner detector and the muon spectrometer provides muon measurement at the highest momenta. ATLAS will also measure the cross-section, couplings, spin and interference properties of a resonance.
Work is ongoing to increase the lepton acceptance further, and ATLAS will extend the kinematic reach of these exciting measurements with much larger datasets in 2011–2012.
A new component of gravity, the scalar gravitational field, may explain the mechanism that allows the immense explosions of type II supernovae to take place. However, this could happen only through a dynamic process – parametric instability – that dates back to work by Lord Rayleigh in the 1880s.
When the central core of a massive star runs out of nuclear fuel (having been converted mainly into iron), it collapses under its own weight in less than a second into an extremely dense neutron star, releasing an enormous amount of gravitational energy. A supernova results, but only a small fraction of the total energy released appears as electromagnetic radiation (light) of the “new star”. The kinetic energy of the exploding stellar envelope is 10 times greater, but the greatest part of the energy by far is carried away by neutrinos, which can more easily escape the dense material of the core.
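The energy budget can be checked with a textbook order-of-magnitude estimate of the gravitational binding energy released, E ≈ (3/5)GM²/R; the mass and radius below are standard assumed values, not taken from the article:

```python
# Order-of-magnitude estimate of the energy released in core collapse.
G = 6.674e-11            # gravitational constant [m^3 kg^-1 s^-2]
M = 1.4 * 1.989e30       # assumed ~1.4 solar-mass core [kg]
R = 1.2e4                # assumed ~12 km neutron-star radius [m]

E = 0.6 * G * M**2 / R
print(f"E ~ {E:.1e} J")  # a few times 10^46 J; only a small fraction emerges
                         # as light, and most is carried away by neutrinos
```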
Detection of neutrinos from supernova SN1987A did much to verify this picture. During core-collapse, the density at the centre of the star rapidly increases, finally forming dense nuclear matter that is extremely difficult to compress. The collapsing material rebounds from this nuclear matter, resulting in an outgoing pressure wave, which soon becomes a huge shock wave.
Extensive studies have attempted to decide whether this “prompt shock” travels all of the way out and ejects the outer part of the star. Indeed, simulations suggest that it stalls at distances of about 300 km from the centre because of the immense energy required to dissociate iron and other nuclei. However, further simulations have found that the shock could restart if the electrons could absorb about 1% of the energy carried by neutrinos. In the neutrino-plasma coupling model, collective interactions between the neutrinos and the plasma could initiate the required energy transfer. Alternatively, recent research suggests that the solution to re-energizing the shock may lie in a fundamental field that takes the simple form of a scalar (like the Higgs field).
Gravity containing a scalar field (originally proposed by Carl Brans and Robert Dicke in the 1960s as an additional component of the gravitational field) has been considered as a promising extension to Einstein’s general relativity in connection with quantum gravity and grand unification. The theory of Brans and Dicke was based on a relatively simple linear coupling to the scalar gravitational field. A few years ago, this linear coupling was shown to be negligible, using radio signals transmitted from the Cassini spacecraft when it was near Saturn.
Now, researchers in the UK and Portugal have analysed the nonlinear coupling to a scalar gravitational field. They find that under extreme conditions with strong time-varying gravity such as may be found in the interior of a newly born neutron star, the scalar gravitational field may be stimulated via parametric instability. The resulting emission of scalar gravitational waves from the neutron core of a collapsing heavy star may be sufficient to re-energize the stalled shock, thus providing a 19th-century solution to a 20th-century problem.
The Crab nebula is the brightest persistent gamma-ray source in the sky with radiation detected up to very high energies (VHE), above 100 GeV. The surprising result of the Very Energetic Radiation Imaging Telescope Array System (VERITAS) is the detection of pulsed radiation from the Crab in this extreme energy range, a new challenge for theorists.
VERITAS is an array of four 12-m telescopes using the imaging air Cherenkov technique (IACT) to detect VHE gamma-rays. Located in Arizona, it is the American equivalent of two European Cherenkov telescope facilities: the Major Atmospheric Gamma-ray Imaging Cherenkov telescope (MAGIC) on the Canary Islands and the High Energy Stereoscopic System (HESS) (CERN Courier January/February 2005 p30 and June 2009 p20). The technique uses the Earth’s atmosphere as a gamma-ray detector: VHE photons interact with atomic nuclei to produce electron–positron pairs, which create a cascade of charged particles. These particles can travel faster than light does in air, producing a flash of Cherenkov radiation in the optical–UV range that is recorded by the telescopes on the ground. The telescope’s camera is an array of photomultipliers that produces an image of the particle shower, which develops at an altitude of 10–20 km and retains the information on the direction and energy of the incoming gamma-ray photon.
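The Cherenkov condition, v > c/n, fixes both the energy threshold for the shower electrons and the characteristic emission angle. A small worked example using the refractive index of air near sea level (at shower altitudes the index is smaller, so the threshold is higher):

```python
# Cherenkov threshold energy and angle for electrons in air (n assumed).
import math

n = 1.0003          # assumed refractive index of air near sea level
m_e = 0.511         # electron rest mass [MeV]

beta_thr = 1.0 / n
gamma_thr = 1.0 / math.sqrt(1.0 - beta_thr**2)
E_thr = gamma_thr * m_e
theta_max = math.degrees(math.acos(1.0 / n))   # emission angle for beta -> 1

print(f"threshold ~ {E_thr:.0f} MeV, Cherenkov angle ~ {theta_max:.2f} deg")
```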
The Crab nebula is the remnant of a supernova explosion witnessed by Chinese and Arab astronomers in 1054 AD and its pulsed emission at radio frequencies was discovered in 1969. The regular pulses revealed the presence of a strongly magnetized neutron star – the collapsed core of the defunct star only about 30 km across – rotating 30 times per second. The same periodic signal from the pulsar was subsequently detected in the optical, X-ray and gamma-ray domain. In 2008, MAGIC was the first Cherenkov telescope to detect pulsed radiation from the Crab above an energy of 25 GeV. It was, however, generally thought that the gamma-ray emission spectrum of the pulsar would cut off at higher energies preventing a detection with current instrumentation above 100 GeV.
As VERITAS has a higher energy threshold than MAGIC, there was only a faint hope that a long observation of the Crab nebula could provide a meaningful detection. Nevertheless, with 107 hours of observations between September 2007 and March 2011 the gamble paid off and the VERITAS collaboration has reported a 6σ detection of VHE pulsed emission from the Crab. The observed spectrum, published in Science, covers the 100–400 GeV range at a flux of about 1% that of the nebula at 150 GeV. Combining the VERITAS data with the lower-energy measurements by MAGIC and the Fermi gamma-ray space telescope provides a broad spectrum extending from 100 MeV to 400 GeV. The new observations clearly disfavour a power-law model with an exponential cut-off. Instead, they suggest that the pulsed gamma-ray emission from the Crab follows a broken power law with a break at a photon energy of about 4 GeV.
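To see why the two spectral forms separate so clearly at these energies, here is a toy comparison; the indices and normalizations are made up for illustration, and only the break/cut-off energy of about 4 GeV is taken from the text:

```python
# Toy comparison of the two spectral shapes discussed above.
import math

def power_law_cutoff(E, A=1.0, a=2.0, E_c=4.0):
    """Power law with an exponential cut-off at E_c (GeV)."""
    return A * E**(-a) * math.exp(-E / E_c)

def broken_power_law(E, A=1.0, a1=2.0, a2=3.8, E_b=4.0):
    """Power law that steepens from index a1 to a2 at the break E_b (GeV)."""
    return A * E**(-a1) if E < E_b else A * E_b**(a2 - a1) * E**(-a2)

for E in (1.0, 10.0, 100.0, 400.0):   # photon energy [GeV]
    print(f"E = {E:6.1f} GeV: cutoff {power_law_cutoff(E):.2e}, "
          f"broken {broken_power_law(E):.2e}")
# the exponential form collapses far too quickly to allow pulsed emission
# out to 400 GeV, whereas the broken power law falls off only gradually
```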
The unexpected VHE observations are extremely difficult to reconcile with emission from the magnetic poles of the neutron star. Indeed, the authors of the paper find that the highest-energy gamma-rays they measure can only be emitted at a distance of more than 10 stellar radii from the neutron star’s surface. They also find that curvature radiation – the emission of electrons following curved magnetic field lines – is unlikely to produce gamma rays above 100 GeV. A plausible alternative could be inverse-Compton scattering, but the theory is not yet able to define clearly where such a process could take place. The fact that the pulses observed by VERITAS are narrower by a factor of two to three than those observed by Fermi at 100 MeV, and that the secondary pulse becomes the dominant one in the VERITAS energy range, could be interesting clues for theorists.
Every three years it is Europe’s turn to host the International Particle Accelerator Conference (IPAC), which takes place annually; Asia and America take over in the other years. IPAC’11, the second conference in the series, attracted some 1100 participants from around 30 countries, about half of them from Europe and the other half from Asia and North America. The meeting gave them the opportunity to exchange ideas on topics such as synchrotron light sources, free-electron lasers, hadron accelerators, colliders and the applications of accelerators.
Three years ago the last in the series of European Particle Accelerator Conferences took place in the Italian seaport of Genoa. Now, reflecting the increasingly worldwide nature of the subject, Europe plays host every three years to the new International Particle Accelerator Conference (IPAC) series, with Asia and North America taking turns in the other two years. Following a successful first meeting in Kyoto in 2010, this year it fell to Europe to organize the 2nd IPAC and once again accelerator specialists found themselves by the sea, this time at San Sebastián on Spain’s Atlantic coast. There, on 4–9 September, the sea and surf provided a fitting setting for a meeting where wave motion figured in many presentations.
The Kursaal conference centre on the water’s edge was an ideal place to house the sessions for around 1100 participants. As at many conferences these consist of plenary, parallel and poster sessions but the balance at IPAC – as at the regional conferences held in the past – makes for a different mixture. A pair of parallel sessions provides a straight choice between two oral presentations, allowing much of the “business” of the conference to occur in busy afternoon poster sessions, which cover different topics each day. These are like scientific bazaars where vendors show off their wares to anyone interested and information is traded freely.
IPAC’11 was truly international, with participants from some 30 countries, around half from Europe and the other half from Asia and North America. The programme spanned four and a half days, with plenary sessions on two mornings, as well as on one afternoon. The two parallel sessions took place on other mornings and afternoons, with the poster sessions (supplied with plenty of refreshments) scheduled alone at the end of four afternoons. Altogether, 1300 posters were scheduled, which is a clear indication of the vibrancy of the field. A special poster session for young scientists – with a prize for the best two – took place on the first day during registration (see box).
At the opening plenary session the Spanish minister of science and innovation, Cristina Garmendia, welcomed participants to the Basque country – her home region – and drew attention to Spain’s increasing involvement in accelerator science. ALBA, the synchrotron light source near Barcelona, was the topic of the first invited talk of the session, which went on to highlight a leading facility in each of the areas of hadron accelerators (the Japan Proton Accelerator Research Complex, J-PARC), circular colliders (the LHC), linear colliders (the International Linear Collider, ILC) and synchrotron light sources and FELs (the Japanese X-ray FEL at SPring-8). These were four of the main themes for the conference, the others being accelerator technology, beam instrumentation and feedback, beam dynamics and electromagnetic fields, as well as applications of accelerators.
The presentations on light sources served to demonstrate just how far and wide accelerator technology has spread to provide valuable research tools for other areas of science. ALBA, built and operated by the CELLS consortium, is based on a 3 GeV booster synchrotron that feeds a storage ring located in the same tunnel (CERN Courier November 2008 p31). The storage ring accumulated first beam on 16 March and by 7 June had achieved a current of 170 mA in readiness for beamline commissioning. Other similar synchrotron-based facilities that are now being designed elsewhere include the Polish facility Solaris and the Iranian Light Source Facility. The design of Sirius, at the Brazilian Synchrotron Light Laboratory, São Paulo, takes a slightly different approach. The dipoles of the 3 GeV ring will be based on permanent magnet technology – for reduced operational costs and increased reliability – and it will combine low-field magnets (0.5 T) for guiding the main beam with short “slices” of high-field magnets (2.0 T) to generate 12 keV photons for the beamlines.
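The choice of a 2 T “slice” for 12 keV photons can be checked against the standard formula for the critical energy of synchrotron radiation, ε_c [keV] ≈ 0.665 E² [GeV²] B [T]:

```python
# Critical photon energy from the Sirius dipoles (standard formula,
# applied to the field and energy values quoted above).
E_beam = 3.0   # ring energy [GeV]

for B in (0.5, 2.0):   # low-field dipole vs high-field "slice"
    e_c = 0.665 * E_beam**2 * B
    print(f"B = {B:.1f} T: critical energy ~ {e_c:.1f} keV")
# ~3 keV from the 0.5 T dipoles and ~12 keV from the 2 T slices
```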
Synchrotron light sources are now being joined increasingly by facilities that use FELs to produce light at ever decreasing wavelengths. In Italy, for example, the 2.4 GeV Elettra synchrotron light source is being supplemented by FERMI, an externally seeded soft X-ray FEL. Its first seeded FEL coherent output at 43 nm, achieved last December, was followed by commissioning of the beamlines and the first tests with users in March and April. In Japan, the SPring-8 Angstrom Compact free electron LAser (SACLA) is the world’s first compact X-ray FEL based on in-vacuum undulators. Commissioning started there in February and the FEL produced its first X-ray laser light at 0.12 nm on 7 June (CERN Courier July/August 2011 p9). In the UK, meanwhile, the ALICE Accelerator R&D Facility at Daresbury has built Europe’s first FEL to be driven by a superconducting energy-recovery linac. The infrared FEL achieved lasing for the first time in October 2010. ALICE is also a source of high-power terahertz radiation, which has potential use in studies of living cells. In June, beam was transferred to a tissue-culture laboratory some 30 m from the accelerator.
Hadron power
While electron accelerators find applications in light sources, hadron accelerators have a role as powerful drivers for a range of facilities. J-PARC, for example, is a multipurpose proton accelerator facility aiming ultimately at an output beam power in the megawatt range. Commissioning started with the 400 MeV H⁻ linac in November 2006 and by 2011 the output beam power from the 3 GeV Rapid-Cycling Synchrotron to the Materials and Life Science Facility had reached 220 kW, while a 145 kW beam was produced by fast extraction from the Main Ring for neutrino production. This progress came to an abrupt halt in March when the massive earthquake struck the north-east of Japan. With its epicentre 270 km from J-PARC, the magnitude-9.0 event caused subsidence of up to 1.5 m at the entrance to the linac but – although the tunnel floor was deformed by as much as 45 mm – the linac structures did not topple. Simulations have shown that misalignments of up to 0.2 mm in the drift-tube quadrupoles can be tolerated by compensating with beam steering. This is speeding up recovery, because only the drift-tube structures most in need of realignment have to be opened up. Beam tests following recovery are planned for December, with a restart in January 2012.
Spallation neutron sources require powerful proton accelerators to drive them. The China Spallation Neutron Source is being designed to operate with an initial proton beam power of 100 kW, upgradeable to 500 kW. Construction was scheduled to start in September and should take six and a half years. It is the first large-scale, high-power accelerator project to be constructed in China.
In Europe, there is a growing need for a new high-flux source of cold neutrons because many research reactors are scheduled to close in the coming decade. The European Spallation Neutron Source (ESS) to be built in Lund should fill this gap. In terms of proton energy, its requirements are modest – 2.5 GeV – but the challenges will come in providing the 5 MW beam power and the 50 mA current in 2.9 ms pulses at 14 Hz. As many as 17 countries are involved in the accelerator development, with a target date of 2020 for first operation.
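The quoted ESS parameters hang together as simple arithmetic – average beam power is energy times peak current times duty factor:

```python
# Consistency check on the ESS figures quoted above (pure arithmetic).
energy_gev = 2.5
peak_current_a = 0.050
pulse_length_s = 2.9e-3
rep_rate_hz = 14.0

duty = pulse_length_s * rep_rate_hz
power_mw = energy_gev * 1e9 * peak_current_a * duty / 1e6
print(f"duty factor ~ {duty*100:.1f}%, average beam power ~ {power_mw:.1f} MW")
# ~4% duty factor and ~5 MW, matching the requirements described above
```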
High-power proton accelerators are equally important for future high-intensity neutrino sources. In Europe the EC has funded EUROnu, a design study to investigate three options for a future high-intensity neutrino oscillation facility: a CERN to Frejus Super-Beam, a Neutrino Factory and a Beta Beam. At CERN, meanwhile, Linac 4 represents the first step in a long-term programme to increase luminosity in the LHC. Civil engineering for a new building has recently finished and construction of the main accelerator components for the 160 MeV H– linac has started with the support of a network of international collaborations. Linac 4 should double the intensity from the Proton Synchrotron Booster; commissioning is currently planned for 2013–2014.
Frontier colliders
At IPAC’11 the LHC had a starring role as a circular collider. It passed the peak-luminosity record of the Tevatron – its predecessor as the world’s highest-energy collider – on 21 April and has since gone on to deliver more than 5 fb–1 of integrated luminosity to the ATLAS and CMS experiments. As the world’s first superconducting accelerator, the Tevatron blazed an important pioneering trail, which was reviewed by Fermilab’s Vladimir Shiltsev after passing a symbolic baton to Mike Lamont, head of the Operations Group at CERN. While the spotlight is now on physics at the LHC, the accelerator experts are looking increasingly towards the future with upgrades to take the luminosity to an average of 5 × 10³⁴ cm⁻² s⁻¹ in the High-Luminosity LHC project. The main goal is to reach 3000 fb–1 of accumulated luminosity in the 10–12 years following the upgrade.
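Those two numbers are consistent with a simple estimate of annual integrated luminosity; the effective running time per year below is an assumption:

```python
# Rough annual integrated luminosity at a levelled 5e34 cm^-2 s^-1.
lumi = 5e34                  # average luminosity [cm^-2 s^-1]
seconds_per_year = 6e6       # assumed effective physics time per year

annual_fb = lumi * seconds_per_year / 1e39   # 1 fb^-1 = 1e39 cm^-2
print(f"~{annual_fb:.0f} fb^-1 per year")    # ~300 fb^-1 a year, i.e. of order
# 3000 fb^-1 over a decade of high-luminosity running
```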
One of the issues with increasing energy (and intensity) in hadron beams concerns collimation. Crystal collimation has been studied for a number of years at CERN. Most recently, tests by the UA9 experiment at the Super Proton Synchrotron show that halo collimation with bent crystals could help to enhance collimation efficiency at the LHC. Another novel idea uses a hollow low-energy electron beam in which the proton halo experiences non-linear transverse kicks. This requires a special electron gun to create the hollow beam; tests with mainly antiprotons have already taken place at Fermilab.
In parallel to the LHC luminosity upgrade, plans are being studied to transform the LHC into the LHeC – a high-luminosity electron–nucleon collider of 1.3 TeV centre-of-mass energy (CERN Courier April 2009 p22). This could be achieved by adding a 60 GeV electron ring or linear accelerator to the existing proton and ion facility; it would complement the LHC’s physics potential and build on the existing LHC investments and its high-luminosity upgrade.
Circular electron colliders have been at the intensity frontier for a number of years, in particle “factories”. With KEKB and PEP-II reaching peak luminosities of 10³⁴ cm⁻² s⁻¹, they delivered a combined integrated luminosity of more than 1 inverse attobarn (1 ab–1). Now the aim is to go 50 times higher, with two facilities, SuperB in Italy and SuperKEKB in Japan. Large crossing angles are a common feature for increased luminosity. At lower energies, a technique using round beams is being developed to give higher luminosities at the VEPP2000 facility at the Budker Institute of Nuclear Physics in Novosibirsk.
The high-energy frontier with electron machines rests with a linear collider. The ILC and Compact Linear Collider (CLIC) design studies exemplify the necessary international approach (CERN Courier December 2010 p7). Excellent results have demonstrated the feasibility of the CLIC concept for a high-energy linear collider based on the two-beam scheme and normal-conducting RF cavities. These concepts and results will form the basis for the CLIC Conceptual Design Report, to be completed in 2012. Another technology that could be key, developed in the context of the ILC design study, is superconducting RF, where a 9-cell cavity is undergoing intensive R&D around the world. As many as 14 560 such cavities would be required for an eventual machine, housed in 1680 cryomodules. So far some 12 cryomodules have been made, with one from KEK operating at more than 35 MV/m in tests.
R&D for the ILC is having spin-offs in other research areas, for example in rare-isotope production. This is typically an application for high-power proton beams, but at TRIUMF the Advanced Rare Isotope Laboratory will couple proton-induced spallation with electron-driven photo-fission for the production of neutron-rich isotopes below the line of stability. The latter requires an electron linac delivering 0.5 MW to give some 10¹³ fissions per second and the plan is to use 9-cell 1.3 GHz cavities developed for the ILC.
Although the frontiers of fundamental particle physics may drive the push particularly to higher energy accelerators, IPAC is a showcase for the applied side of this research and its close links with industry. An industrial exhibition on the first three days of the conference allowed 76 companies to present their hi-tech products and services. A special session on “Spanish Science Industry” also took place one evening.
Hadron therapy is one high-profile application of particle accelerators that now has close links with industry (Hadron therapy: collaborating for the future). In the final session, invited speaker Koji Noda, of the National Institute of Radiological Sciences (NIRS) in Japan, reviewed accelerator and beam-delivery technologies for hadron-therapy facilities around the world. The number of facilities is set to grow from around 30 to more than 50 in the coming decade, thanks to the development of suitable technologies. These include gated irradiation linked to beam extraction, which can switch the beam on or off within a millisecond in response to breathing – one of several techniques pioneered in Japan. For the future, more compact machines have the potential to reduce costs, for example with laser-ion acceleration or with fixed-field alternating-gradient (FFAG) machines. Another interesting development centres on the non-scaling FFAG concept, using permanent magnets for a compact proton accelerator.
In another talk that showed that the field has social awareness, Colin Carlile posed the question: “Is it possible to operate a large research facility with wind power?” The answer seems to be, “yes, to a certain degree”. As director-general of the ESS he is in a position to lead by example and showed schemes to reduce annual electricity consumption from 350 GWh to 250 GWh, with CO₂ savings of 165,000 tonnes each year. A final, sobering note for the conference came from John Duncan, until recently the UK Ambassador for Arms Control and Disarmament, with his talk “Towards a World Without Nuclear Weapons: How can Scientists Help?”
The success of IPAC’11 was a result of the excellent collaboration between the international teams of the Organizing Committee and the Scientific Programme Committee, and the Local Organizing Committee. The large number of participants and the enthusiasm shown in San Sebastián indicate the strong mandate for the IPAC series from the worldwide accelerator community. This community will be looking forward to the next edition, which will take place in New Orleans on 20–25 May 2012.
• IPAC’11 was organized under the auspices of the European Physical Society Accelerator Group (EPS-AG) and the International Union of Pure and Applied Physics (IUPAP). The proceedings of IPAC’11 are published on the JACoW site: www.jacow.org. Thanks to the JACoW team of seasoned experts and freshly trained volunteers, led by Christine Petit-Jean-Genaz of CERN, a pre-press version with 1236 contributions was published five days after the conference. The final version was published on the site just three weeks after the conference – yet another impressive record set by the JACoW collaboration.