The ALPHA collaboration at CERN has made the first direct analysis of how antimatter is affected by gravity. The ALPHA experiment was the first to trap atoms of antihydrogen, holding them in place with a strong magnetic field for up to 1000 s. Although the main goal is not to study gravity, the team realized that the data they have collected might be sensitive to gravitational effects. Specifically, they searched for the free fall (or rise) of antihydrogen atoms released from the trap, which allowed them to set direct limits on the ratio of the gravitational to inertial mass of antimatter, F = Mg/M.
Measuring a total of 434 atoms, they found that in the absence of systematic errors, F must be < 75 at a statistical significance level of 5%; the worst-case systematic errors increase this limit to < 110. A similar search places somewhat tighter bounds on a negative F, that is, on antigravity. Refinements of the technique, coupled with larger numbers of cold-trapped antiatoms, should allow future measurements to place tighter bounds on F and approach the interesting region around 1.
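The scale of the measurement can be illustrated with a toy free-fall estimate. The sketch below computes how far an antiatom would fall during trap release for different values of F; the 20 ms release window is a hypothetical round number chosen for illustration, not ALPHA's actual ramp-down time.

```python
# Toy estimate of the vertical displacement an antihydrogen atom would
# acquire during trap release, for different values of F = Mg/M.
# The 20 ms release window is a hypothetical round number, not the
# experiment's actual ramp-down time.

G = 9.81  # gravitational acceleration, m/s^2

def fall_distance(F, t):
    """Distance fallen (m) under an effective acceleration F*g in time t (s)."""
    return 0.5 * F * G * t**2

t_release = 0.02  # s, hypothetical
for F in (1, 75, 110):
    print(f"F = {F:>3}: falls {fall_distance(F, t_release) * 1000:.1f} mm")
```

For F = 1 the displacement over such a window is only a couple of millimetres, much smaller than the trap, which is why only weak limits on F are currently possible.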
Meanwhile, the antimatter programme at CERN is expanding. AEgIS and GBAR, two experiments currently under construction, will focus on measuring how gravity affects antihydrogen.
Astronomical observations – such as the rotation velocities of galaxies and gravitational lensing – show that more than 80% of the matter in the universe remains invisible. Deciphering the nature of this “dark matter” remains one of the most interesting questions in particle physics and astronomy. The CMS collaboration recently conducted a search for the direct production of dark-matter particles (χ), with especially good sensitivity in the low-mass region that has generated much interest among scientists studying dark matter.
Possible hints of a particle that may be a candidate for dark matter have already begun to appear in the direct-detection experiments; most recently the CDMS-II collaboration reported the observation of three candidate events in its silicon detectors with an estimated background of 0.7 events. This result points to low masses, below 10 GeV/c², as a region that should be particularly interesting to search. This mass region is where the direct-detection experiments start to lose sensitivity because they rely on measuring the recoil energy imparted to a nucleus by collisions with the dark-matter particles. For a low-mass χ, the kinetic energy transferred to the nucleus in the collision is small, and the detection sensitivity drops as a result.
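The drop in sensitivity follows from elastic-scattering kinematics: the maximum recoil energy is E_max = 2μ²v²/m_N, where μ is the χ–nucleus reduced mass. A minimal sketch, assuming a typical halo velocity of 230 km/s (an illustrative figure, not from the text):

```python
# Why direct-detection sensitivity drops at low dark-matter mass: the
# maximum recoil energy transferred to a nucleus in an elastic collision
# is E_max = 2 * mu^2 * v^2 / m_N, with mu the chi-nucleus reduced mass.
# The 230 km/s halo velocity is a typical illustrative value.

C = 299_792_458.0  # speed of light, m/s

def max_recoil_keV(m_chi_GeV, m_N_GeV, v_m_per_s=230e3):
    mu = m_chi_GeV * m_N_GeV / (m_chi_GeV + m_N_GeV)  # reduced mass, GeV/c^2
    beta2 = (v_m_per_s / C) ** 2
    return 2 * mu**2 * beta2 / m_N_GeV * 1e6  # convert GeV to keV

m_si = 26.2  # silicon nucleus mass in GeV/c^2 (approx. 28 u)
for m_chi in (5, 10, 100):
    print(f"m_chi = {m_chi:>3} GeV: E_max ~ {max_recoil_keV(m_chi, m_si):.1f} keV")
```

For a 5 GeV χ on silicon the maximum recoil is below 1 keV, at or below typical detector thresholds, whereas a 100 GeV χ deposits tens of keV.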
The CMS collaboration has searched for hints of these elusive particles in “monojet” events, where the dark-matter particles escape undetected, yielding only “missing momentum” in the event. A jet of initial-state radiation can accompany the production of the dark-matter particles, so a search is conducted for an excess of these visible companions compared with the expectation from Standard Model processes. The results are then interpreted within the framework of a simple “effective” theory for their production, where the particle mediating the interaction is assumed to have high mass. An important aspect of the search by CMS is that there is no fall in sensitivity for low masses.
The monojet search requires at least one jet with more than 110 GeV of energy and has the best sensitivity if there is more than 400 GeV of missing momentum. Events with additional leptons or multiple jets are vetoed. After event selection, 3677 events were found in the recent analysis, with an expectation from Standard Model processes of 3663 ± 196 events. The contribution from electroweak processes dominates this expectation, either from pp → Z+jets with the Z decaying to two neutrinos or from pp → W+jets, where the W decays into a lepton and neutrino, while the lepton escapes detection.
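A quick check with the quoted numbers confirms that the observed count is entirely compatible with the Standard Model expectation:

```python
# Significance of the monojet counting excess: 3677 events observed
# against a Standard Model expectation of 3663 +/- 196.
import math

n_obs, n_exp, sigma_exp = 3677, 3663, 196
# Total uncertainty: background uncertainty combined in quadrature with
# the Poisson fluctuation on the expected count.
sigma_tot = math.hypot(sigma_exp, math.sqrt(n_exp))
z = (n_obs - n_exp) / sigma_tot
print(f"excess = {n_obs - n_exp} events, significance ~ {z:.2f} sigma")
```

The excess of 14 events corresponds to well under a tenth of a standard deviation.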
With no significant deviation from the expectation from the Standard Model, CMS has set limits on the production of dark matter, as shown in the figures of the χ–nucleon cross-section versus χ mass. The limits show that CMS has good sensitivity in the low-mass regions of interest, for both spin-dependent and spin-independent interactions.
In March 2012, the LHCb collaboration reported an observation of CP violation in charged B-meson decays, B± → DK±. Now, just over a year later, the collaboration has announced a similar observation in the decays of another B meson, in this case the B0s meson, composed of a beauty antiquark bound with a strange quark. The observation of CP violation in the decays B0s → K–π+ with a significance of more than 5σ marks the first time that this effect has been found in the decays of B0s mesons – only the fourth type of meson in which it has been seen. It is an important milestone for LHCb because the precise study of B0s decays is sensitive to possible physics beyond the Standard Model.
The study of CP violation in charmless charged two-body B decays provides stringent tests of the Cabibbo-Kobayashi-Maskawa picture of CP violation in the Standard Model. However, the presence of hadronic contributions means that several measurements from such decays are needed to exploit flavour symmetries and disentangle the different contributions. In 2004, the BaBar and Belle collaborations at SLAC and KEK, respectively, discovered direct CP violation in the decay B0 → K+π– and a model-independent test was proposed to check the consistency of the observed size of the effect with the Standard Model. The test consists of comparing CP violation in B0 → K+π– with that in B0s → K–π+. The B factories at KEK and SLAC did not have the possibility of accumulating large enough samples of B0s decays and, despite much effort by the CDF collaboration at Fermilab’s Tevatron, CP violation had until now not been seen in B0s → K–π+ with a significance exceeding 5σ.
Using a data sample corresponding to an integrated luminosity of 1.0 fb⁻¹ collected by the experiment in 2011, the LHCb collaboration measured the direct CP-violating asymmetry for B0s → K–π+ decays, ACP (B0s → K–π+) = 0.27 ± 0.04 (stat.) ± 0.01 (syst.), with a significance of more than 5σ. In addition, the collaboration improved the determination of direct CP violation in B0 → K+π– decays, ACP (B0 → K+π–) = –0.080 ± 0.007 (stat.) ± 0.003 (syst.), which is the most precise measurement of this quantity to date. The four plots in figure 2 show the Kπ invariant-mass distributions. The upper plots indicate the well-established difference in the decay rates of the B0 meson and its antiparticle. The enlargements in the lower plots reveal that a difference is also visible around the mass of the B0s meson. The measured values are in good agreement with the Standard Model expectation.
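The raw asymmetry behind such a measurement is just the normalized difference of the signal yields in the two charge-conjugate final states. A minimal sketch, with made-up yields chosen only to land near the measured ACP(B0s → K–π+) = 0.27 (the real analysis also corrects for production and detection asymmetries):

```python
# Raw CP asymmetry from the yields of the two charge-conjugate final states:
#   A_CP = (N_f - N_fbar) / (N_f + N_fbar)
# The yields below are invented placeholders, not LHCb's numbers.

def raw_acp(n_f, n_fbar):
    return (n_f - n_fbar) / (n_f + n_fbar)

n_f, n_fbar = 444, 256  # hypothetical signal yields
print(f"A_CP = {raw_acp(n_f, n_fbar):.2f}")
```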
Only the data sample collected in 2011 was used to obtain these results, so LHCb will improve the precision further with the total data set now available, which more than trebled with the excellent performance of the LHC during 2012.
When quarks and gluons (partons) in opposing beams at high-energy hadron colliders meet, they can scatter violently to produce correlated showers of particles, or “jets”. In proton–proton (pp) collisions, the rate of such events can be predicted using state-of-the-art QCD calculations and compared with the measurements. However, in heavy-ion collisions, jets are expected to be modified by the interaction of the scattered partons with the surrounding excited nuclear matter – the quark–gluon plasma, or QGP. Jets and this phenomenon of “jet quenching” thus provide important diagnostic probes of the QGP.
ALICE, which is devoted to a broad study of the QGP at the LHC, first observed jet quenching in lead–lead (PbPb) collisions through the suppressed production rate of high-momentum single hadrons (CERN Courier June 2011 p17). Fully reconstructed jets are measured using the high-precision tracking of charged particles in the ALICE central barrel, together with a measurement of the energy of neutral particles in the EMCal (figure 1). The EMCal is a lead–scintillator sampling calorimeter covering |η| < 0.7 and 100° in azimuth, which consists of 11,520 separate towers, each subtending Δη × Δφ = 0.014 × 0.014. A late addition, its installation was completed in January 2011.
This “tracking+EMCal” method of reconstructing jets differs from the more traditional approach with hadronic plus electromagnetic calorimetry and provides a systematically different way to study jets. The first step for ALICE in the study of fully reconstructed jets in heavy-ion collisions is nevertheless to measure jets in the simpler environment of pp collisions, to determine the expected production rate for jets.
In March 2011, the LHC delivered a three-day run with pp collisions at √s = 2.76 TeV to provide the reference measurements for PbPb collisions at the same centre-of-mass energy. During this brief run, the recently commissioned EMCal was employed as a fast trigger for jets, allowing ALICE to accumulate an integrated luminosity of 13.6 nb⁻¹. Jet reconstruction was then carried out using two resolution parameters, R = 0.2 and 0.4, which define the maximum distance in phase space over which particles are clustered. An overall systematic uncertainty of 18–20% was achieved for the jet cross-section in the 20–125 GeV/c momentum range of the measurement.
Figure 2 shows the ratio of the inclusive differential jet cross-sections for R = 0.2 and R = 0.4, together with the predictions from QCD. This cross-section ratio is sensitive to the distribution of energy within jets and is of particular interest in the study of jets in heavy-ion collisions. The theoretical calculation agrees with the measured ratio if “hadronization” effects, which arise because the experiment measured hadrons and not partons, are taken fully into account.
These results demonstrate that ALICE can measure jets well with the advantage of precise determination of the jet structure, which is of crucial importance for studies of jet modification in PbPb collisions.
The TOTEM collaboration has published the first luminosity-independent measurement at the LHC of the total proton–proton cross-section at a centre-of-mass energy of 8 TeV. This follows the collaboration’s published measurement of the same cross-section at 7 TeV, which demonstrated the reliability of the luminosity-independent method by comparing several approaches.
The method requires the simultaneous measurements of the inelastic and elastic rates, as well as the extrapolation of the latter down to a four-momentum transfer squared, |t| = 0. This is achieved with the experimental set-up consisting of two telescopes, T1 and T2, to detect charged particles produced in inelastic proton–proton collisions, and Roman Pot stations to detect elastically scattered protons at very small angles.
The analysis at 8 TeV was performed on two data samples recorded in July 2012 during special fills of the LHC with the magnets set to give the parameter β* = 90 m. During these fills, the Roman Pots were inserted close to the beam, allowing the detection of around 90% of the nuclear elastic-scattering events. Simultaneously, the inelastic scattering rate was measured by the T1 and T2 telescopes.
By applying the optical theorem, the collaboration determined a total proton–proton cross-section of 101.7 ± 2.9 mb, which is in good agreement with the extrapolation from lower energies. The method also allows the derivation of the luminosity-independent elastic and inelastic cross-sections: σel = 27.1 ± 1.4 mb and σinel = 74.7 ± 1.7 mb. The two measurements are consistent in terms of detector performance, showing comparable systematic uncertainties, and they are both in good agreement with the extrapolation of the lower-energy measurements.
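The published numbers can be checked for internal consistency: the elastic and inelastic cross-sections must add up to the total.

```python
# Consistency check of TOTEM's luminosity-independent results.
# The method uses the optical theorem,
#   sigma_tot = 16*pi/(1 + rho^2) * (dN_el/dt at t=0) / (N_el + N_inel),
# but here we only verify the additivity of the published cross-sections.

sigma_el, sigma_inel, sigma_tot = 27.1, 74.7, 101.7  # mb

total = sigma_el + sigma_inel
print(f"sigma_el + sigma_inel = {total:.1f} mb (published total: {sigma_tot} mb)")
```

The sum, 101.8 mb, agrees with the quoted total of 101.7 mb within rounding, well inside the ±2.9 mb uncertainty.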
The NOvA neutrino detector that is currently under construction in northern Minnesota has recorded its first 3D images of particle tracks. Researchers started up the electronics for a section of the first block of the NOvA detector in March and the experiment was soon catching more than 1000 cosmic rays a second. Once completed in 2014, the NOvA detector will consist of 28 blocks with a total mass of 14,000 tonnes. The blocks are made of PVC tubes filled with scintillating liquid. It will be the largest free-standing plastic structure in the world.
Fermilab, located 810 km south-east of the NOvA site, will start sending neutrinos to Minnesota in the summer. The laboratory is finalizing the upgrades to its Main Injector accelerator, which will provide the protons that produce the neutrino beam. The upgraded accelerator will produce a pulse of muon neutrinos every 1.3 seconds and the goal is to achieve a proton-beam power of 700 kW. A smaller, 330-tonne version of the far detector for NOvA will be built on the Fermilab site to measure the composition of the neutrino beam before it leaves the laboratory.
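The quoted figures fix the beam intensity. A back-of-the-envelope sketch, assuming the Main Injector's 120 GeV proton energy (a standard figure, not stated in the text):

```python
# How many protons per pulse does a 700 kW beam imply, given one pulse
# every 1.3 s? Assumes 120 GeV protons from the Main Injector (this
# energy is an assumption, not quoted in the article).

E_PROTON_GeV = 120.0  # assumed proton energy
E_PROTON_J = E_PROTON_GeV * 1e9 * 1.602176634e-19  # GeV -> joules
power_W, cycle_s = 700e3, 1.3

energy_per_pulse_J = power_W * cycle_s
protons_per_pulse = energy_per_pulse_J / E_PROTON_J
print(f"~{protons_per_pulse:.1e} protons per pulse")
```

Under these assumptions, each pulse carries roughly 5 × 10¹³ protons.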
The neutrino beam will provide particles for three experiments: MINOS, located 735 km from Fermilab in the Soudan Underground Laboratory, right in the centre of the neutrino beam; NOvA, which is located off axis to probe a specific part of the energy spectrum of the neutrino beam, optimal for studying the oscillation of muon neutrinos into electron neutrinos; and MINERvA, a neutrino experiment located on the Fermilab site.
The NOvA collaboration aims to discover the mass hierarchy of the three known types of neutrino – which type of neutrino is the heaviest and which is the lightest. The answer will shed light on the theoretical framework that has been proposed to describe the behaviour of neutrinos. Their interactions could help to explain the imbalance of matter and antimatter in today’s universe; there is even the possibility that there might be still more types of neutrino.
The NOvA detector will be operated by the University of Minnesota under a co-operative agreement with the Office of Science of the US Department of Energy (DOE). About 180 scientists, technicians and students from 20 universities and laboratories in the US and another 14 institutions around the world are members of the NOvA collaboration. The scientists are funded by the DOE, the US National Science Foundation and funding agencies in the Czech Republic, Greece, India, Russia and the UK.
A new particle accelerator in the UK has achieved a significant electron acceleration milestone. On 5 April, the Versatile Electron Linear Accelerator (VELA) produced its first electron beam, an important step on the way to being ready for commercial and research use this summer.
VELA, which is situated at the Daresbury Laboratory of the Science and Technology Facilities Council, is designed to be one of the most flexible particle accelerators of its type. The medium-term aim is to develop the 6 MeV injector with additional linac sections in order to achieve 250 MeV beams at 400 Hz with bunch charges in the range 50–250 pC. At present, the beam pulses are generated by targeting a copper photo-cathode with a UV laser.
With stable, reliable beams over a broad range of energies, VELA will provide interesting new opportunities for users and collaborators. The facility is exceptional in offering access on “both sides of the wall”, allowing users not only to perform conventional studies on samples but also to access the accelerator itself. This opens up the possibility of testing a variety of accelerator components or items for beam diagnostics.
One of the primary collaborating institutes currently working on VELA is Strathclyde University. The team from Strathclyde has provided a significant level of hardware that will allow a demonstration of the capability of RF injectors for use with laser-driven plasma wakefield accelerators. The researchers plan to install an RF injector for Strathclyde’s project Advanced Laser-plasma High-energy Accelerators towards X-rays (ALPHA-X), but to date they have not been able to demonstrate a suitable performance capability. Working with VELA, however, they have developed a system that is directly suited to their application; its design is now being qualified, enabling its use at the university’s facility.
The plan for VELA is to continue collaborations with other leading institutions and with industry. The aim is that the facility will allow the development of technological advances in accelerator design, for use not only in research but also in industry.
Three unusually long-lasting stellar explosions discovered by NASA’s Swift satellite represent a new class of gamma-ray bursts (GRBs). Astrophysicists conclude that they probably arose from the catastrophic death of supergiant stars hundreds of times larger than the Sun.
GRBs are extremely powerful flashes of gamma rays observed from a random direction on the sky about once a day. They are traditionally classified as short- and long-duration events. Short bursts last 2 s or less and are thought to represent a merger of compact objects (neutron stars or black holes) in a binary system. Long GRBs last up to several minutes and are probably associated with the birth of a black hole during the supernova explosion of a massive star (CERN Courier September 2003 p15). Both scenarios give rise to powerful jets that propel matter at nearly the speed of light in opposite directions. As they interact with matter in and around the star, the jets produce a spike of high-energy radiation (CERN Courier December 2005 p20).
While most of the thousands of GRBs observed so far fall into these two categories, there are also peculiar sub-energetic bursts (CERN Courier September 2004 p13) and unrelated gamma-ray events that arise from the tidal disruption of a star by a supermassive black hole (CERN Courier July/August 2011 p14). Now, three recent GRBs with extremely long duration are making astronomers consider an additional category.
The first evidence of the need for a new class of GRB came from the analysis of GRB 111209A, which erupted on 9 December 2011 and remained active for 7 hours as observed by NASA’s Swift spacecraft and several other gamma-ray, X-ray and optical instruments. The detailed study, led by Bruce Gendre while at the Italian Space Agency’s Science Data Centre, shows that the burst is a genuine GRB at a redshift of z = 0.677 but with an outstanding long duration and a high total flux.
An earlier event, GRB 101225A, exploded on Christmas Day 2010 and produced high-energy emission lasting at least two hours. Because the distance to this atypical GRB was unknown, astronomers thought that this so-called “Christmas burst” could be of a radically different nature. One group suggested an asteroid or comet falling onto a neutron star within the Galaxy, while another team suspected a merger of a neutron star with an evolved giant star to be at the origin of the burst. Both scenarios are disproved by the recent measurement of the redshift of the host galaxy by Andrew Levan of the University of Warwick and his team. They place the Christmas burst 7000 million light-years away (z = 0.847), implying that the burst was far more powerful than first thought. Levan and colleagues link it with the similar GRB 111209A and another recent burst, GRB 121027A, all of extremely long duration.
Both studies propose that the ultralong duration of such atypical bursts is related to the size of the collapsing star. The duration of the event would be proportional to the time that it takes for matter to fall towards the new-born black hole at the stellar core or for the particle jets to drill their way through the star. In either case, the bigger the star the longer the duration. The likely candidates for ultralong GRBs would thus be supergiant stars with a size of hundreds of times the Sun’s diameter. Gendre’s team goes further, suggesting that GRB 111209A marked the death of a blue supergiant containing modest amounts of elements heavier than helium. This would imply that ultralong-duration GRBs would have been much more common in the distant past of the universe, when matter was not yet enriched in the heavy elements produced by massive stars.
The recent identification of the new particle discovered at the LHC as a Higgs boson with a mass of 125 GeV/c² completes the picture of particles and forces described by the Standard Model. However, it does not mark the end of the story as, unfortunately, the Standard Model is an incomplete description of nature. Puzzles still remain, for example, in explaining the existence of dark matter and the matter–antimatter asymmetry. The answers to these puzzles may lie in the existence of as yet undiscovered particles that would have played a key role in the early, high-energy, phase of the universe and whose existence would help to complete the description of nature in particle physics. The question then is: at what energy scale would these new particles appear?
Particle physics provides no certain knowledge about this scale but the hope is that the new particles might be produced directly in the high-energy proton–proton collisions of the LHC. However, new particles could also be observed indirectly through the effects of their participation as virtual particles in rare decay processes. By studying such processes, experiments can probe mass scales that are much higher than those accessible directly through the energy available at the LHC. This is because quantum mechanics and Heisenberg’s uncertainty principle allow virtual particles to have masses that are not constrained by the energy of the system. Searches based on virtual particles are limited by the precision of the measurements, rather than the energy of the collider.
Rare potential
One promising place to look for contributions from new virtual particles is in the rare transitions of b quarks to s quarks in which a muon pair (dimuon) is produced: b → sμ+μ–. Described by the Feynman diagrams shown in figure 1 (overleaf), these involve what are known as “flavour-changing neutral currents” because the initial quark changes flavour without changing charge. In the Standard Model, transitions of this type are forbidden at the lowest perturbative order – that is, at “tree-level”, where the diagrams have only two vertices. Instead, they are mediated as shown in figure 1 by higher-order diagrams known as “electroweak penguin” and “box” diagrams. For this reason the Standard Model process is rare, which enhances the potential to discover new high-mass particles.
Studies of flavour-changing neutral currents have paved the way for discoveries in particle physics in the past, specifically in the decays of K mesons, where s quarks change to d quarks. Investigations of mixing between the mass eigenstates of the neutral kaon system and of rare K-meson decays led to the prediction of the existence of a second u-like quark (the charm quark, c), at a time when only three quarks were known (u, d and s). It was 10 years before the existence of the c quark was confirmed directly. Similarly, the observation of CP violation in neutral kaons led to the prediction of the third generation of quarks (b and t). Now, the study of flavour-changing neutral-current processes related to the third generation of quarks – in particular the rare b → sμ+μ– transitions – could soon provide similar evidence for the existence of new particles.
Several b → sμ+μ– transitions have already been observed by the Belle, BaBar and CDF experiments at KEK, SLAC and Fermilab respectively. So far, the results have been limited by the small size of the data sets but with the LHC, a new era of precision has begun. The collider is the world’s largest “factory” for producing particles that contain b quarks: in one year, it produced about 10¹² b hadrons in the LHCb experiment, while running at a centre-of-mass energy of 7 TeV with an instantaneous luminosity in the experiment of 4 × 10³² cm⁻² s⁻¹. ATLAS and CMS have also recently joined the game, showing their first results on the B0 → K*0 μ+μ– decay at the BEAUTY 2013 conference (ATLAS collaboration 2013 and CMS collaboration 2013).
The LHCb detector is characterized by excellent vertex and momentum resolution (coming from its tracking systems) and impressive particle-identification capabilities (from its two ring-imaging Cherenkov detectors). Combined with the large b-hadron production rate, these features allow LHCb to reconstruct clean signals of rare b-hadron decays (figure 2). These processes have branching fractions below 10⁻⁶, occurring at most once in every 100 million collisions.
The branching fractions of these decays are sensitive to new physics but their interpretation is unfortunately complicated. The b quark has hadronized, so the observations relate to hadronic rather than quark-level processes. A lack of detailed understanding of the hadronic system limits the usefulness of the branching-fraction measurements in the search for new physics.
Angles and asymmetries
Fortunately, the branching fractions of these decays are not the only handles for investigating new particle contributions. It is often much more instructive to look at the angular distribution of the particles coming from the decay. However, such angular analyses are experimentally challenging because they require a detailed understanding of how both the geometry of the detector and the reconstruction of the event bias the angular distribution of the particles.
The decays B0 → K*0μ+μ– and B0s → φμ+μ– have been shown to be highly sensitive to a variety of new physics scenarios (LHCb collaboration 2013a and 2013b). These decays are characterized by three angles: θK, which describes the K* or φ decay; θl, which describes the dimuon decay; and Φ, the angle between the K* or φ and the dimuon decay planes.
The angular distribution of the particles depends on the properties of the underlying theory. For instance, two features of the Standard Model drive the angular distribution: the photon exchanged in the penguin diagram of figure 1 is transversely polarized, while the charged-current interaction (the W exchange) is purely left-handed. The dimuon system also exhibits an intrinsic forward–backward asymmetry, which arises from interference between the different diagrams. This forward–backward asymmetry can be studied as a function of the mass of the dimuon system, which can lie anywhere between twice the muon’s mass and the difference between the mass of the B and the mass of the K* or φ.
In the Standard Model, the forward–backward asymmetry has a characteristic behaviour, changing sign at a dimuon mass of around 2 GeV/c². It turns out that this point can be predicted with only a small theoretical uncertainty. Figure 3 shows LHCb’s measurement of the forward–backward asymmetry in the decay B0 → K*0μ+μ–. In addition, the angle Φ can be used to test nature’s left-handedness, through an observable called AT(2).
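The kinematic window within which the zero-crossing sits follows directly from the particle masses. A short check, using PDG mass values:

```python
# Kinematic range of the dimuon mass in B0 -> K*0 mu+ mu-: from twice the
# muon mass up to m(B0) - m(K*0). Masses are approximate PDG values in MeV/c^2.

M_MU, M_B0, M_KSTAR0 = 105.66, 5279.6, 895.8

q_min = 2 * M_MU          # dimuon mass threshold
q_max = M_B0 - M_KSTAR0   # maximum dimuon mass
print(f"dimuon mass range: {q_min:.0f} - {q_max:.0f} MeV/c^2")
```

The range runs from about 211 MeV/c² up to roughly 4.4 GeV/c², so the Standard Model zero-crossing near 2 GeV/c² sits comfortably inside the accessible window.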
So far, measurements of both the forward–backward asymmetry and AT(2) show good agreement with the predictions of the Standard Model. While there is no evidence for any disagreement, it is nevertheless important to emphasize that the room for new physics is still large given the statistical uncertainty of the present measurements.
Another way to decrease the theoretical uncertainty associated with the hadronic transitions is to form asymmetries between specific decay modes – for example, CP asymmetries between particle and antiparticle decays. In the Standard Model, the decay B0 → K*0μ+μ– and its CP conjugate are expected to have the same branching fraction to about 1 part in 1000. With the large LHC data samples, LHCb has verified this at the level of 4% (LHCb collaboration 2013e).
Another example concerns so-called isospin asymmetries between decays that differ only in the type of spectator quark (u or d), labelled q in figure 1. The isospin asymmetry between B0 and B+ decays is defined as:
AI = [B(B0 → K(*)0μ+μ–) – (τ0/τ+) B(B+ → K(*)+μ+μ–)] / [B(B0 → K(*)0μ+μ–) + (τ0/τ+) B(B+ → K(*)+μ+μ–)]
This is formed using the branching fractions of the B0 and B+ decays and the ratio τ0/τ+ of the lifetimes of the B0 and the B+. In the Standard Model, the spectator quark is expected to play only a limited role in the dynamics of the system, so isospin asymmetries are predicted to be tiny. Experimentally, AI is measured as a double ratio with respect to the decay channels B0 → K(*)0 J/ψ or B+ → K(*)+ J/ψ, which give the same final states after the J/ψ decays to μ+μ– and are well known from previous measurements.
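A minimal sketch of this combination of branching fractions and the lifetime ratio; the branching fractions below are invented placeholders, not measured values, while τ0/τ+ ≈ 0.93 is the approximate world-average B0/B+ lifetime ratio.

```python
# Isospin asymmetry built from the B0 and B+ branching fractions and the
# lifetime ratio tau0/tau+. The branching-fraction inputs are hypothetical.

def isospin_asymmetry(br_b0, br_bplus, tau_ratio=0.93):
    """A_I = (BR0 - r*BR+) / (BR0 + r*BR+), with r = tau0/tau+."""
    num = br_b0 - tau_ratio * br_bplus
    den = br_b0 + tau_ratio * br_bplus
    return num / den

# Hypothetical branching fractions (arbitrary common units):
print(f"A_I = {isospin_asymmetry(3.3, 4.4):.2f}")
```

With these placeholder inputs the asymmetry comes out negative, illustrating the kind of value discussed below; in the Standard Model the expectation is close to zero.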
Isospin asymmetries have been measured for both B → K*μ+μ– and B → Kμ+μ– by the BaBar, Belle, CDF and LHCb experiments. All of these measurements are in good agreement with each other and favour a value for AI(B → K*μ+μ–) that is close to zero and a negative value for AI(B → Kμ+μ–). The LHCb experiment observes a negative isospin asymmetry in the B → Kμ+μ– channel at the level of four standard deviations from zero, as figure 4 shows (LHCb collaboration 2012). This unexpected result is yet to be explained. Indeed, most extensions of the Standard Model do not predict a significant dependence on the charge or flavour of the spectator quark.
Looking to the future
The LHCb experiment already has on tape a data set that is roughly three times larger than that used in its results published so far. Even with only 1 fb⁻¹ of integrated luminosity currently analysed, LHCb has larger samples than all previous experiments combined in most of the channels shown in figure 2. Furthermore, while the current selected data sets contain hundreds of events, the samples will be of the order of tens of thousands of events once the experiment has been upgraded. With these larger data sets the LHCb collaboration will be able to chase progressively smaller deviations from the Standard Model. This will allow them to probe ever higher mass scales, far beyond those that can be accessed by searching directly for the production of new particles at the LHC. A new era in precision measurements of flavour-changing neutral currents is now opening.
“Mary K [Gaillard], Dimitri [Nanopoulos], and I first got interested in what are now called penguin diagrams while we were studying CP violation in the Standard Model in 1976 … The penguin name came in 1977, as follows.
In the spring of 1977, Mike Chanowitz, Mary K and I wrote a paper on GUTs [grand unified theories] predicting the b quark mass before it was found. When it was found a few weeks later, Mary K, Dimitri, Serge Rudaz and I immediately started working on its phenomenology.
That summer, there was a student at CERN, Melissa Franklin, who is now an experimentalist at Harvard. One evening, she, I and Serge went to a pub, and she and I started a game of darts. We made a bet that if I lost I had to put the word penguin into my next paper. She actually left the darts game before the end, and was replaced by Serge, who beat me. Nevertheless, I felt obligated to carry out the conditions of the bet.
For some time, it was not clear to me how to get the word into this b-quark paper that we were writing at the time … Later … I had a sudden flash that the famous diagrams look like penguins. So we put the name into our paper, and the rest, as they say, is history.”
John Ellis in Mikhail Shifman’s “ITEP Lectures in Particle Physics and Field Theory”, hep-ph/9510397. Reproduced here courtesy of symmetry magazine.
The Nuclotron-based Ion Collider fAcility (NICA) is the future flagship project of the Joint Institute for Nuclear Research in Dubna. In addition to the existing Nuclotron, this accelerator-collider complex will include a new heavy-ion linear accelerator, a superconducting 25 Tm booster synchrotron and two rings for a superconducting collider. The new facility will ultimately provide a range of different ion beams for a variety of experiments with both colliding beam and fixed targets (see box).
Construction of the 3 MeV/u heavy-ion linear accelerator is now under way in co-operation with the BEVATECH Company in Germany; its commissioning in Dubna is scheduled for the end of 2013. Serial production of superconducting magnets for the booster is expected to start in early 2014. The Technical Design Report for the collider complex has meanwhile been approved. As the first step in the realization of the NICA heavy-ion programme, Baryonic Matter at Nuclotron (BM@N) – a new fixed-target experiment developed in co-operation with GSI, Darmstadt – has been approved by JINR’s Programme Advisory Committee and Scientific Council and is now under construction.
In the meantime, the modernized Nuclotron, which will be a key element of the future facility, is being used for basic research in accelerator physics and techniques, the development of modern diagnostics and the testing of prototypes for the collider and booster systems. This is in addition to the implementation of the current physics programme at the superconducting 45 Tm synchrotron. Development work for NICA performed during recent Nuclotron runs includes the testing of elements and prototypes for the Multipurpose Detector using extracted deuteron beams; the transportation of the extracted beam (C⁶⁺ ions at 3.5 GeV/u and deuterons at 4 GeV/u) to the point where the BM@N detector is under construction; tests of the Nuclotron operating with a long flat-top of the high magnetic field (up to 1000 s, 1.5 T) to simulate the operating conditions of the magnetic system for the collider; and operational tests of the automatic control system based on the TANGO platform, which has been chosen for the NICA facility.
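The magnetic rigidities quoted above translate directly into beam momenta through the standard relation p ≈ 0.2998·q·Bρ, with p in GeV/c, the ion charge q in units of e and Bρ in T·m. The sketch below applies it to a proton at the Nuclotron's 45 Tm rigidity purely as an illustration; the actual maximum energies of the facility depend on the full machine design, not this single relation.

```python
# Illustrative only: convert magnetic rigidity (T*m) to beam momentum.
# p [GeV/c] = 0.299792458 * q * B*rho, with q the ion charge in units of e.

C = 0.299792458  # GeV/c per (T*m) per unit charge

def momentum_gev_per_c(rigidity_tm: float, charge: int = 1) -> float:
    """Beam momentum for an ion of the given charge at the given rigidity."""
    return C * charge * rigidity_tm

def kinetic_energy_gev(p_gev: float, mass_gev: float) -> float:
    """Relativistic kinetic energy from momentum and rest mass."""
    return (p_gev**2 + mass_gev**2) ** 0.5 - mass_gev

# A proton (m ~ 0.938 GeV/c^2) at a rigidity of 45 T*m:
p = momentum_gev_per_c(45.0)          # ~13.5 GeV/c
ek = kinetic_energy_gev(p, 0.938272)  # ~12.6 GeV
print(f"p = {p:.2f} GeV/c, E_k = {ek:.2f} GeV")
```

For a multiply charged ion the same relation applies per unit charge, which is why heavy nuclei reach a lower kinetic energy per nucleon than protons at the same rigidity.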
A particularly important step concerned the construction, installation and testing at the Nuclotron of the prototype for the collider’s stochastic cooling system. This is of major importance for NICA’s heavy-ion programme because beam cooling during collisions is essential for providing maximal luminosity across the whole energy range of 1–4.5 GeV/u. Operational experience of stochastic cooling and experimental investigations of the beam-cooling process at the Nuclotron are therefore a necessity.
The design and construction of the stochastic-cooling channel at the Nuclotron began in mid-2010 in close collaboration with the Forschungszentrum Jülich (FZJ). All stages of the work have been strongly supported by the director of the FZJ’s Institute for Nuclear Physics (IKP), Rudolph Mayer. This R&D is also important to IKP FZJ for testing elements of the stochastic-cooling system designed for the High-Energy Storage Ring (HESR), which will form part of the future international Facility for Antiproton and Ion Research in Darmstadt.
The main task of beam cooling at the HESR will be to accumulate a beam with 10¹⁰ antiprotons above 3 GeV at a momentum resolution down to 10⁻⁵ for the PANDA experiment. To enhance beam-cooling performance, new ring slot couplers have been developed at FZJ for the pick-up and kicker structures. The pick-ups were tested successfully at the Cooler Synchrotron at FZJ in experiments with the internal target of the Wide-Angle Shower Apparatus.
A pick-up and kicker, each assembled from 16 rings designed for a 2.4 GHz bandwidth, were produced at FZJ for testing at JINR, as the institutes joined forces to prepare for an experiment on stochastic cooling at the Nuclotron. The kicker structure was installed in the room-temperature section of the Nuclotron, with the pick-up structure in the cold section on the opposite side of the 251-m circumference ring, operating at 4.5 K. The first experiments aimed at achieving longitudinal cooling using the filter method. The notch filter and tunable system delay were implemented on optical lines, and a maximum power of 20 W was chosen for the final amplifier.
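The filter method relies on a notch filter whose response vanishes at every harmonic of the beam's revolution frequency, so particles at the nominal momentum receive no correction signal while off-momentum particles do. A minimal sketch of that transfer function, H(f) = 1 − exp(−2πifT_rev), using an assumed revolution period (not the Nuclotron's actual value):

```python
import numpy as np

# Illustrative sketch of the notch-filter transfer function used in the
# filter method of stochastic cooling. The revolution period is an
# assumed round number, not a Nuclotron parameter.

T_rev = 1e-6            # assumed revolution period: 1 microsecond
f_rev = 1.0 / T_rev     # corresponding revolution frequency: 1 MHz

f = np.linspace(0.5e6, 3.5e6, 7)            # sample a few frequencies
H = 1.0 - np.exp(-2j * np.pi * f * T_rev)   # notch-filter response

for fi, hi in zip(f, H):
    print(f"f = {fi/1e6:.1f} MHz  |H| = {abs(hi):.3f}")
# |H| vanishes at 1, 2 and 3 MHz (revolution harmonics) and is maximal
# halfway between them, where off-momentum particles sit.
```

The depth and spacing of the notches are what make the method sensitive to momentum deviation: a particle's revolution frequency shifts with its momentum, moving its Schottky lines off the notches.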
Construction of the system, its assembly and the cryogenic tests were completed in the autumn of 2011. Then, in December 2011, the equipment was tested for the first time in Nuclotron run 44 with C6+ and deuteron beams. The performance of the system was improved following the results of these first tests, and the software required to adjust the system was developed. This enabled the recent successful test during Nuclotron run 47 in February and March this year, when the system was adjusted to cool the coasting 3 GeV/u deuteron beam and on 20 March the decrease in its momentum spread was demonstrated (figure 1).
To make the effect more observable, the initial momentum spread was increased artificially by manipulation of the RF voltage at the final stage of the beam acceleration. The beam-cooling time of about 360 s is in reasonable agreement with the simulations. Details of this experiment are to be presented at the COOL13 conference in June. Another important result from the recent run was the increase of the maximum deuteron beam energy delivered for physics experiments up to 4.8 GeV/u.
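As a rough illustration of what a 360 s cooling time means, the momentum spread can be modelled with simple exponential damping; the real stochastic-cooling dynamics (gain, mixing, amplifier noise) are considerably richer, and the initial spread below is an assumed value.

```python
import math

# A minimal sketch, assuming simple exponential damping of the momentum
# spread with the ~360 s cooling time reported from the Nuclotron run.

tau = 360.0    # measured cooling time, seconds
dp0 = 1.0e-3   # assumed initial relative momentum spread (illustrative)

def spread(t: float) -> float:
    """Relative momentum spread after t seconds of cooling."""
    return dp0 * math.exp(-t / tau)

# After one cooling time the spread has dropped to ~37% of its initial value:
print(spread(360.0) / dp0)   # ~0.368
```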
The experimental investigation of stochastic cooling was a complex test of machine performance. During the Nuclotron run, the cryogenic and magnetic systems, power supply and quench-protection systems, cycle control and diagnostic equipment were operated stably in a mode in which the circulation time of the accelerated beam at the flat-top of the magnetic field gradually increased from a few tens of seconds up to eight minutes. The safe operation of the magnetic system was guaranteed by a new quench-detection system commissioned during the run, which permits the number of active detectors to be changed promptly, combining group and individual detector operation. The detectors for this new method and their automatic control systems were developed at the Nuclotron and have been chosen for manufacture and installation on the NICA booster. The system monitors the status of all of its components, tests the signals of external systems and indicates malfunctions.
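The basic idea behind such quench detection can be sketched in a few lines: flag a coil as quenching when its resistive voltage stays above a threshold for longer than a validation time, then report the status of every detector. The class, thresholds and timings below are hypothetical, not the Nuclotron's actual implementation.

```python
from dataclasses import dataclass
from typing import Optional

# Toy sketch of threshold-plus-validation-time quench detection.
# All names and numbers are assumed for illustration.

@dataclass
class QuenchDetector:
    name: str
    threshold_v: float = 0.1      # resistive-voltage threshold, V (assumed)
    validation_s: float = 0.01    # time threshold must be exceeded (assumed)
    _over_since: Optional[float] = None
    quenched: bool = False

    def update(self, t: float, resistive_voltage: float) -> None:
        """Feed one (time, voltage) sample; latch `quenched` on a real quench."""
        if resistive_voltage > self.threshold_v:
            if self._over_since is None:
                self._over_since = t
            elif t - self._over_since >= self.validation_s:
                self.quenched = True   # would trigger energy extraction here
        else:
            self._over_since = None    # brief spike: reset, do not trip

detectors = [QuenchDetector(f"dipole-{i}") for i in range(3)]
# A spurious spike on detector 0, a sustained signal on detector 1:
for t in (0.000, 0.005, 0.010, 0.015, 0.020):
    detectors[0].update(t, 0.2 if t == 0.005 else 0.0)
    detectors[1].update(t, 0.2)
    detectors[2].update(t, 0.0)
print([d.quenched for d in detectors])   # [False, True, False]
```

The validation time is what distinguishes a genuine resistive transition from pick-up noise, mirroring the distinction between group and individual detector operation described above.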
These tests were the result of an international team effort: A Sidorin, N Shurkhno, G Trubnikov (JINR, Dubna) and R Stassen (IKP FZJ) supervised all stages of the system design and participated in the Nuclotron shifts dedicated to testing and adjusting the equipment; T Katayama and H Stockhorst (GSI and IKP FZJ) performed simulations of the cooling-process dynamics and experimental measurements; L Thorndahl and F Caspers (CERN) contributed to the design and simulation of RF structures.
NICA's Objectives
The NICA facility will provide experiments with:
• extracted ion beams (from protons up to gold or uranium nuclei) at kinetic energies up to 13.8 GeV for protons, 6 GeV/u for deuterons and 4.5 GeV/u for heavy nuclei. The fixed-target experiment BM@N is under construction by a JINR–GSI collaboration;
• colliding heavy-ion beams with a kinetic energy in the range 1–4.5 GeV/u at a luminosity of 10²⁷ cm⁻² s⁻¹;
• colliding heavy and light ions with the same energy range and luminosity;
• colliding polarized beams of light ions in the kinetic-energy range 5–12.5 GeV/u for protons and 2–5.8 GeV/u for deuterons, at a luminosity of not less than 10³¹ cm⁻² s⁻¹.
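The luminosities quoted above can be related to beam parameters through the standard formula for head-on collisions of Gaussian beams, L = f_rev·n_b·N₁N₂/(4πσₓσᵧ). The sketch below uses entirely hypothetical numbers, not NICA design values, simply to show that the arithmetic lands in the quoted ballpark.

```python
import math

# Illustrative only: head-on luminosity for round Gaussian beams,
# L = f_rev * n_b * N1 * N2 / (4 * pi * sigma_x * sigma_y).
# Every number below is assumed, not a NICA design parameter.

def luminosity(f_rev, n_bunches, n1, n2, sigma_x_cm, sigma_y_cm):
    """Luminosity in cm^-2 s^-1 for head-on Gaussian-beam collisions."""
    return f_rev * n_bunches * n1 * n2 / (4 * math.pi * sigma_x_cm * sigma_y_cm)

L = luminosity(f_rev=0.5e6,                   # revolution frequency, Hz
               n_bunches=20,                  # bunches per ring
               n1=1e9, n2=1e9,                # ions per bunch
               sigma_x_cm=0.05, sigma_y_cm=0.05)  # rms beam sizes at the IP
print(f"L ~ {L:.1e} cm^-2 s^-1")   # a few times 10^26 with these inputs
```

With these made-up inputs the result sits near the 10²⁷ cm⁻² s⁻¹ scale quoted for heavy ions; real machines also fold in crossing-angle, hourglass and bunch-length corrections.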
NICA’s beams will be available to these experimental areas and facilities:
• a 10,000 m² experimental hall for fixed-target experiments, using slow extracted beams from the Nuclotron;
• a dedicated experimental hall for applied research on extracted ion beams from the booster;
• the collider of heavy and light polarized ions, equipped with the Multipurpose Detector and the Spin Physics Detector for fundamental research;
• an internal target station in the Nuclotron cryomagnetic system for research, including relativistic atomic physics and spin physics.