Precise mass measurements may help decode X-ray bursts

Researchers at the Michigan State University (MSU) National Superconducting Cyclotron Laboratory (NSCL) have made precise mass measurements of four proton-rich nuclei, 68Se, 70Se, 71Br and an excited state of 70Br. The results may make it easier to understand type I X-ray bursts, the most common stellar explosions in the galaxy.

These bursts occur in the hot and dense environment that arises when a neutron star accretes matter from a companion star in a binary system. In these circumstances, rapid burning of hydrogen and helium occurs through a series of proton captures and beta decays known as the rp process, releasing an energy of 10³²–10³³ J in the form of X-rays in a burst 10–100 s long. Generally the capture–decay sequence happens in a matter of seconds or less, but “waiting points” occur at the proton dripline, where further protons are too weakly bound to be captured and the slower beta decays intervene.

One of the major waiting points involves 68Se, which has 34 neutrons and 34 protons, and closely related nuclei. The lifetimes of these nuclei influence the light curve of the X-ray burst as well as the final mix of elements created in the burst process. The lifetimes of the waiting points in turn depend critically on the masses of the nuclei involved, which also influence the possibility for double-proton capture that can bypass the beta-decay process and hence the waiting point.

The experiment at NSCL, conducted by Josh Savory and colleagues, used the Low Energy Beam and Ion Trap facility, LEBIT, for the mass measurements of the four nuclei. The nuclides themselves were produced by projectile fragmentation of a 150 MeV/u primary 78Kr beam and separated in flight by the A1900 separator. LEBIT takes rare-isotope beams travelling at roughly half the speed of light, then slows and stops the ions so that their masses can be measured with high accuracy by Penning-trap mass spectrometry.

The experiment reached uncertainties ranging from 0.5 keV for 68Se to 15 keV for 70mBr – an improvement in precision of up to a factor of 100 (for 71Br) over previous measurements. The team then used the new masses as input to calculations of the rp process and found an increase in the effective lifetime of 68Se, together with more precise predictions for the luminosity of a type I X-ray burst and for the elements produced.
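
In a Penning trap the mass follows from the ion's cyclotron frequency, νc = qB/(2πm), so a tiny relative uncertainty on a frequency translates directly into a tiny mass uncertainty. A minimal sketch of this relation; the 9.4 T field and the singly charged A = 68 ion are illustrative assumptions, not values taken from the article:

```python
import math

# Physical constants (CODATA, SI units)
Q_E = 1.602176634e-19    # elementary charge, C
U   = 1.66053906660e-27  # atomic mass unit, kg

def cyclotron_freq(mass_u, charge_state, b_tesla):
    """Cyclotron frequency nu_c = qB / (2*pi*m) of an ion in a Penning trap."""
    return charge_state * Q_E * b_tesla / (2 * math.pi * mass_u * U)

def mass_from_freq(nu_c_hz, charge_state, b_tesla):
    """Invert the relation: a measured frequency gives the ion mass (in u)."""
    return charge_state * Q_E * b_tesla / (2 * math.pi * nu_c_hz) / U

# Illustrative: a singly charged A = 68 ion in an assumed 9.4 T field
nu = cyclotron_freq(68.0, 1, 9.4)
print(f"nu_c ~ {nu/1e6:.2f} MHz")

# A relative frequency precision of ~1e-8 maps onto the mass:
m = mass_from_freq(nu, 1, 9.4)       # round trip recovers 68.0 u
dm_kev = 68.0 * 1e-8 * 931494.0      # 1 u = 931494 keV/c^2; gives ~0.6 keV
print(f"delta m ~ {dm_kev:.1f} keV/c^2")
```

The last line shows why Penning traps dominate precision mass measurement: a part-in-10⁸ frequency determination on an A = 68 ion already corresponds to sub-keV mass precision, the scale of the 68Se result quoted above.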

PETRA III stores its first positron beam

DESY’s new third-generation synchrotron radiation source, PETRA III, accelerated its first beam on 16 April. At 10.14 a.m. the positron bunches were injected and stored in the 2.3 km accelerator for the first time. The start of operation with the beam concludes a two-year upgrade that converted the storage ring PETRA into a world-class X-ray radiation source.

As the most powerful light source of its kind, PETRA III will offer excellent research possibilities to researchers who require tightly focused and very short-wavelength X-rays for their experiments. In particular, PETRA III will have the lowest emittance – 1 nm rad – of all high-energy (6 GeV) storage rings worldwide.

Following the stable storage of the particle beam achieved on 16 April, the accelerator is now being set up for the production of synchrotron radiation. The undulators – the magnets that ensure the machine’s high brilliance in X-rays – will be positioned to force the beam to oscillate and emit the desired intense, short-wavelength radiation. At the same time, the mounting of the 14 beamlines to be used for experiments continues in the newly constructed experiment hall. A first test run with synchrotron radiation is planned for this summer; regular user operation of the new synchrotron radiation source will start in 2010.

The PETRA storage ring began life as a leading electron–positron collider in the 1980s and later became a pre-accelerator for the electron/positron–proton collider, HERA. The remodelling into PETRA III at a cost of €225 million was funded mainly by the Federal Ministry of Education and Research, the City of Hamburg and the Helmholtz Association.

Editor’s note

It is 400 years since Galileo Galilei looked at the heavens through a telescope and changed our view of the universe for ever. In celebration and to stimulate worldwide interest in astronomy and science, the International Astronomical Union (IAU) and UNESCO have initiated the International Year of Astronomy 2009 (IYA2009).

Particle and nuclear physics may deal with the smallest components of matter, but both have strong links with astronomy – the news story above is just one example. This issue of CERN Courier celebrates IYA2009 with this and several longer articles. Nobel laureate George Smoot considers the exciting times in modern cosmology (Cosmology’s golden age), while features on Borexino and MAGIC (Borexino homes in on neutrino oscillations and A MAGIC touch brings astronomical delights) look at two of the many experiments in the new field of astroparticle physics. Lastly, Viewpoint (Big Science, bigger outreach) considers a valuable message these “big” sciences offer to the public at large.

Fermi measures the spectrum of cosmic-ray electrons and positrons

The Fermi Gamma-ray Space Telescope can do more than observe gamma rays. It has now provided the most accurate measurement of the spectrum of cosmic-ray electrons and positrons. The results are consistent with a single power law, but visually they suggest an excess emission from about 100 GeV to 1 TeV. The additional source of electrons and positrons could be nearby pulsars or dark-matter annihilation.

The characterization of cosmic rays is under intense investigation. In addition to the ground-based Pierre Auger and Milagro observatories, balloon- and space-borne experiments are measuring the spectrum of electrons and positrons. Before 2008 this spectrum was determined only to within a factor of two or three, by balloon-borne experiments and by the Space Shuttle flight of the Alpha Magnetic Spectrometer (AMS) in 1998 (CERN Courier June 1999 p6). Recently, however, results from the Advanced Thin Ionization Calorimeter (ATIC) have had a huge impact. This balloon experiment, flown above Antarctica, suggested a strong excess of electrons and positrons at energies of 300–800 GeV. The spectrum obtained has a peak that is consistent with the annihilation of dark-matter particles with a mass of about 600 GeV (Chang et al. 2008). Then, earlier this year, the Payload for Antimatter Matter Exploration and Light-nuclei Astrophysics (PAMELA) space experiment found a clear excess of positrons over electrons at energies above 5 GeV (CERN Courier May 2009 p12).

The story continues to unfold with results from the Large Area Telescope (LAT) on board the Fermi mission (CERN Courier November 2008 p13). The LAT is sensitive to the electromagnetic cascades generated inside the detector by incoming gamma rays or by charged cosmic rays. Using only the first six months of data, the Fermi collaboration obtained the most accurate measurement yet of the spectrum of electrons plus positrons in the 20 GeV to 1 TeV range (Abdo et al. 2009). The contamination by gamma rays in the electron/positron sample is estimated to be less than 2%, and hadronic events could also be discriminated against well enough that the maximum systematic error remains below 20%, even at 1 TeV. The published spectrum is consistent with a simple power law when statistical and systematic errors are conservatively taken into account. The fitted power law falls with energy as E^−3.0, which is slightly harder than the expectation from a conventional model of diffusive electron propagation in the Milky Way.
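
A single power law dN/dE ∝ E^−γ is a straight line in log–log space, so the spectral index is simply the slope of log(flux) against log(energy). A minimal sketch of that fit on synthetic, noise-free data (the generated points are illustrative, not Fermi's):

```python
import math

def powerlaw_index(energies, fluxes):
    """Least-squares slope of log(flux) vs log(E); for dN/dE ~ E^-gamma this returns -gamma."""
    xs = [math.log(e) for e in energies]
    ys = [math.log(f) for f in fluxes]
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    var = sum((x - mx) ** 2 for x in xs)
    return cov / var

# Synthetic spectrum: dN/dE = A * E^-3.0 over roughly 20 GeV - 1 TeV
energies = [20 * 1.55 ** k for k in range(10)]   # GeV
fluxes = [1e-4 * e ** -3.0 for e in energies]

print(f"fitted index: {powerlaw_index(energies, fluxes):.2f}")  # -3.00
```

A real analysis would weight each bin by its statistical and systematic errors; the "bump" discussed below shows up as a systematic run of residuals about such a fit rather than as a failure of the fit itself.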

Visually, however, the Fermi data suggest a deviation from a simple power law above about 100 GeV. There is evidence for a bump at these energies, suggesting an additional component of primary electrons. The bump resembles the excess found by ATIC, but is much less pronounced. It can be put in parallel with the increased positron fraction derived by PAMELA, although the latter is at energies about 10 times lower. While not fully consistent with one another, all three experiments suggest that there is an additional component of electrons and positrons towards higher energies. The origin of the excess could be dark-matter annihilation, but there are also alternative explanations, in particular the contribution of nearby pulsars. These possibilities are not discussed in the published paper, but the collaboration foresees that Fermi will significantly improve the understanding of the electron spectrum in the coming months, in particular by searching for anisotropies in the arrival directions of the electrons.

Further reading

A A Abdo et al. 2009 Phys. Rev. Lett. 102 181101.

J Chang et al. 2008 Nature 456 362.

Borexino homes in on neutrino oscillations

The mystery of the “missing” solar neutrinos arose in the 1970s when the pioneering experiment by Raymond Davis and colleagues in the Homestake Mine in South Dakota detected only one-third or so of the number of electron-neutrinos from the Sun that they expected. It was 30 years before this puzzle was solved, when the Sudbury Neutrino Observatory (SNO) confirmed the proposal that the neutrinos change type on their way from the centre of the Sun, reducing the number of electron-neutrinos arriving at the Earth. Such oscillations from one type to another can only occur if the neutrinos detected are mixtures of states with some difference in mass, in turn implying that neutrinos must have mass – a finding that lies beyond the Standard Model of particle physics.

Solar neutrinos have for the past 40 years been detected either by exploiting radiochemical techniques or by the detection of Cherenkov radiation. The Homestake detector exemplified the radiochemical method, with electron-neutrinos interacting with 37Cl to produce 37Ar, which was then extracted and detected through its radioactive decay. SNO, on the other hand, used heavy water to detect Cherenkov radiation from charged particles that were produced by neutrino interactions in the liquid. The results from all of the various experiments are best described by the theoretical description of neutrino oscillation developed by Stanislav Mikheyev, Alexei Smirnov and Lincoln Wolfenstein (MSW), and in particular the solution with a large mixing angle (LMA) between the mass states.

Towards the MSW-LMA scenario

To explain the flux of electron-neutrinos relative to the total flux of solar neutrinos observed in SNO, as well as the results from Homestake and other experiments, the MSW-LMA mechanism requires two different regimes for neutrino oscillation: resonant, matter-enhanced oscillations in the dense core of the Sun for energies above 5 MeV (as in SNO); and vacuum-driven oscillations for low energies, below 2 MeV (as in the gallium radiochemical experiments, GALLEX, its successor the Gallium Neutrino Observatory, and SAGE). Now, for the first time, the Borexino experiment at the Gran Sasso National Laboratories has found experimental evidence for the transition between these two oscillation regimes by detecting in real time both low-energy (0.862 MeV) and high-energy (3–16 MeV) solar neutrinos, from 7Be and 8B, respectively. These nuclei are both formed in certain branches of the principal chain of reactions that converts hydrogen to helium at the Sun’s core – the so-called proton–proton (pp) chain, which starts with the pp process, p + p → d + e⁺ + νe. While the 7Be neutrinos form 7% of the neutrinos that emanate from the Sun, the 8B neutrinos above 5 MeV correspond to only 0.006% of the total flux.

Borexino consists of an unsegmented liquid-scintillator detector with a target mass of 278 t of pseudocumene (C9H12) doped with 1.5 g/l of PPO (2,5 diphenyloxazole). The scintillator is contained inside a thin (125 μm) nylon vessel that is shielded against external background by a second nylon vessel and about 1 kt of buffer, which consists of pseudocumene mixed with 5 g/l of light quencher (dimethylphthalate). A total of 2212 8-inch photomultipliers mounted on a 13.7 m diameter stainless-steel sphere (SSS) detect the scintillator light. The SSS works as a containment vessel for both the scintillator and the buffer. It is installed inside a tank containing 2100 t of high-purity water.

The 7Be measurement

One of the main research goals for Borexino is the detection of the solar neutrinos emitted in the electron-capture reaction of 7Be, which occurs in 15% of the conversions through the proton–proton chain. The 7Be neutrinos are monoenergetic (0.862 MeV, with a 90% branching ratio) and in Borexino they are detected via elastic scattering between neutrinos and electrons. The 7Be solar neutrinos offer a unique way to tag events: the kinematic, Compton-like edge at 0.665 MeV in the recoil-electron spectrum. This spectral signature is important because individual solar-neutrino interactions cannot be disentangled from the residual beta-decay radioactivity arising from natural contaminants present in the scintillator. Figure 1 shows the expected solar-neutrino spectrum in Borexino, emphasizing the signal from the 7Be neutrinos.

The intrinsic radiopurity level of the scintillator is the main experimental challenge for such a detector. In Borexino, after five years of R&D, we developed purification methods that allowed us to achieve excellent purity, with intrinsic 238U and 232Th contamination levels of less than one part in 10¹⁷. This level of radiopurity – a record in the field – allows us to study neutrino interactions in real time at, and below, 1 MeV. It also opens up new research windows such as:

• the possibility of detecting, in real time, neutrinos from the pep reaction and the CNO chain

• measuring low-energy 8B neutrinos through the reaction 13C(νe,e⁻)13N

• searching for rare processes with very high sensitivity, such as probing the Pauli exclusion principle at the level of >10³⁰ y⁻¹ by searching for non-Paulian transitions in 12C nuclei (Derbin 2008).

Borexino has been taking data since May 2007. After a few months a clear signal in the energy spectrum of events detected in the fiducial mass of about 80 t revealed the first detection of 7Be solar neutrinos (Borexino Collaboration, Arpesella et al. 2008). This observation allowed the first direct determination of the electron-neutrino survival probability, Pee, below 1 MeV. The MSW-LMA model predicts two regimes for Pee: below 1 MeV, with Pee ≈ 0.6; and above 2 MeV, with Pee ≈ 0.3. Prior to Borexino only radiochemical experiments could probe the energy region below 1 MeV, and they all measured an integrated solar-neutrino flux above a certain threshold – the threshold for the electron-neutrino capture interaction. The observation of 7Be neutrinos by Borexino provides a result of Pee = 0.56 ± 0.10 at 0.862 MeV, in good agreement with the MSW-LMA prediction (Borexino Collaboration, Alimonti et al. 2008).

This measurement casts light on another unresolved aspect of the physics of the solar core: the ratio of helium production via the pp chain and a cycle that involves carbon, nitrogen and oxygen (the CNO cycle). When taken all together, the integrated rates measured by Homestake and the gallium experiments are a function of the fluxes of solar neutrinos from pp, 7Be, the CNO cycle and the decay of 8B. Therefore, using the Borexino result on 7Be neutrinos, it is possible to study the correlation between the pp and CNO fluxes. Figure 2 shows contours at the 68%, 90% and 99% confidence levels for the combined estimate of the pp and CNO fluxes, normalized to the predictions of the Standard Solar Model (SSM). The 8B flux is fixed by the Cherenkov experiments (Super-Kamiokande and SNO).

As figure 2 shows, the measurement of 7Be neutrinos is important for the study of a fundamental parameter, the flux of pp neutrinos, which are the most abundant solar neutrinos produced in the core of the Sun. The theory of beta decay, with some extension, allows the calculation of the basic p + p → d + e⁺ + νe cross-section, which at 1 MeV is around 10⁻⁴⁷ cm². Measuring such a small value is beyond the reach of current technology, so the cross-section for this important process – which drives the evolution of the Sun – can only be determined theoretically. A check of the flux predicted by the SSM for pp neutrinos is therefore important.

Figure 2 makes use of the luminosity constraint – a specific linear combination of solar-neutrino fluxes that corresponds to the measured photon luminosity of the Sun, assuming that nuclear-fusion reactions are responsible for generating energy inside the Sun. Without the constraint the fit gives fpp = 1.04 (+0.13/−0.19); with the luminosity constraint it gives fpp = 1.005 (+0.008/−0.020). These are the best measurements of the pp solar-neutrino flux. The result on fCNO translates into a CNO contribution to the solar luminosity of <5.4% (90% CL); the current SSM predicts a contribution of order 1%.

Borexino has also recently performed a measurement of the 8B solar-neutrino flux above 3 MeV, which was possible because of the high radiopurity achieved. Prior to Borexino, 8B neutrinos were measured above 5 MeV using Cherenkov detectors. The results from these experiments agree well with Borexino’s measurement.

The measurement of the 8B flux allows a determination of the corresponding value of Pee at an effective energy (taking into account the spectrum of 8B neutrinos) of 8.6 MeV. So by detecting 8B neutrinos Borexino has measured Pee simultaneously at 0.862 MeV and at 8.6 MeV (figure 3). Disregarding systematic effects, which are common to the low- and high-energy measurements, the results show a difference of about 2σ between Pee for 7Be and for 8B neutrinos. The measured ratio of the survival probabilities for 7Be and 8B neutrinos is currently 1.60 ± 0.33 (Borexino Collaboration, Bellini et al. 2008). Using other solar-neutrino observations it is also possible to determine Pee for pp neutrinos, which figure 3 also shows. Combined, these results confirm for the first time, at today's accuracy, the vacuum–matter transition predicted by the MSW-LMA scenario.

• Borexino at the Gran Sasso Laboratory is an international collaboration funded by INFN (Italy); NSF (US) for Princeton University, Virginia Tech, University of Massachusetts Amherst; BMBF and DFG (Germany) for MPI für Kernphysik Heidelberg, TU München; Rosnauka (Russia) for RRC Kurchatov Institute and JINR; MNiSW (Poland) for Institute of Physics Jagellonian University; and Laboratoire APC Paris.

Cosmology’s golden age

“La verità è il destino per il quale siamo stati fatti” (Truth is the destiny for which we were made). This article gives an example of how “truth” is achieved through “discovery” – the method used in science. By revealing nature, discovery is the way in which we can achieve truth, or at least glimpse it. But how can we know or have confidence that we have made a correct discovery? Here we can look to the major architect of the scientific method, Galileo Galilei: “La matematica è l’alfabeto nel quale Dio ha scritto l’Universo” (Mathematics is the alphabet in which God has written the universe). A discovery will be described best – and most economically and poetically – mathematically.

Virtual space flight

There has never been a more exciting time for cosmologists than now. Through advanced techniques and ingenious, often heroic, observational efforts, we have obtained a direct and extraordinarily detailed picture of the universe – from very early times to the present. I recently had the pleasure of using a specially outfitted planetarium at the Chabot Space and Science Center in Oakland, California, and taking a virtual flight through the universe on a realistic (though often faster-than-light) journey based on real astronomical data.

We took off from the surface of the Earth and zoomed up to see the International Space Station at its correct location in orbit. When we first arrived we could only see a dark region moving above the Earth, but soon the space station’s orbit brought it out of the Earth’s shadow into direct sunlight. We circled round, looking at it from all sides, and then swiftly moved on to see the solar system with all the planets in their correct current locations. After a brief visit to the spectacular sight of Saturn we continued out to see the stars in our neighbourhood before moving on, impatient to see the whole galaxy with all the stars in the positions determined by the Hipparcos satellite mission. After that we travelled farther out to see our local group of galaxies, dominated by our own Milky Way and the Andromeda galaxy.

Moving more and more quickly we zoomed out and saw many clusters of galaxies. I was having trouble deciding quickly enough which supercluster was Coma, Perseus-Pisces or Hydra-Centaurus when viewed from an arbitrary location and moving through the universe so fast. Then, using the latest galaxy survey data, we went out farther to where we were seeing half-way to the edge of the observable universe. All the galaxies were displayed in their observed colours and locations – millions of them, admittedly only a fraction of the estimated 100 billion in the visible universe, but still incredibly impressive in number and scope, revealing the web of the cosmos.

We were actually moving through time as well as space. As we went farther from the Earth we reached distances from which light takes a long time to reach our own planet, so we were looking at objects at a much younger age (earlier in time). It was fun flying through the universe at faster-than-light speed and seeing all of the known galaxies. Soon I asked to see out to the edge (and beyond). The operator brought up the data for the cosmic microwave background (CMB) – at the time, the 3-year maps from the Wilkinson Microwave Anisotropy Probe – and it appeared behind the distant galaxies. I asked to move right to the edge, and in the process of zooming out we went past the CMB map surface and were looking back at the sphere containing the full observable universe. Where were we? Out in the part from which light has not had time to reach Earth and – if our current understanding is correct – never will. But still we wonder about what is out there, and we have some hope of understanding.

The second reason why this is such an incredibly exciting time in cosmology is that these observations, combined with careful reasoning and an occasional brilliant insight, have allowed us to formulate an elegant and precisely quantitative model for the origin and evolution of the universe. This model reproduces to high accuracy everything that we observe over the history of the universe, images of which are displayed in the planetarium.

We now have precise observations of a very early epoch in the universe through the images made using the CMB radiation and we hope to start a newer and even more precise and illuminating effort with the launch of the Planck Mission on 14 May. However, we also have many impressive galaxy surveys and plans for even more extensive surveys using new ideas to see the relics of the acoustic oscillations in the very, very early universe, as well as the gravitational lensing caused by the more recently formed large-scale structures, such as clusters of galaxies that slightly warp the fabric of space–time by their presence. Each will give us new images and thus new information about the overall history of the universe.

However, the model invokes new physics; some explicitly and some by omission. First, we put in inflation, the physical mechanism that takes a small homogeneous piece of space–time and turns it into something probably much larger than our currently observable universe but with all its features, including the very-small-amplitude fluctuations discovered with the Differential Microwave Radiometers on the Cosmic Background Explorer, which are the seeds of modern galaxies and clusters. Second, we put in dark matter, which plays the key role in the formation of structure in the universe and holds the clusters and galaxies together. This is a completely new kind of matter – unlike any other with which we have experience. It does not interact electromagnetically with light but apparently does interact gravitationally, precisely the property needed for it to form structure. A third additional ingredient is dark energy, which is used to balance the energy budget of the universe and explain the accelerating rate of expansion observed in the more recent history of the universe. Last, we need baryogenesis, the physical mechanism that explains the dominance of matter over antimatter. We have good reason to believe that there were equal amounts of matter and antimatter at the very beginning, but now matter prevails.

If we add these four extra ingredients in the simplest possible form we can reproduce the observable universe in our simulations or analytic calculations to an accuracy that is equal to (and probably better than) the current observational accuracy – at roughly the per cent level.

There are other things that we don’t put in so explicitly but have reason to suspect might be there. For example, we work with a universe constrained by three large dimensions of space and one of time, even though we know that more dimensions are possible and may be necessary. We do not deal with our confinement to 4D. We also stick with the four known basic forces even though there is plenty of opportunity for new forces; and likewise for additional relics from earlier epochs.

Universal ingredients

The success of the standard cosmological model has many consequences that puzzle us and also raises several key questions, which are far from answered. The observation of dark energy demonstrates that our well established theories of particles and gravity are at least incomplete – or not fully correct. What makes up the dark side of the universe? What process, in detail, created the primordial fluctuations? Is gravity purely geometry as Albert Einstein envisaged, or is there more to it (such as scalar partners and extra dimensions)? An unprecedented experimental effort is currently being devoted to address these grand-challenge questions in cosmology. This is an intrinsically interdisciplinary issue that will inevitably be at the forefront of research in astrophysics and fundamental physics in the coming decades. Cosmology is offering us a new laboratory where standard and exotic fundamental theories can be tested on scales not otherwise accessible.

The situation in cosmology is rife with opportunities. There are well defined but fundamental questions to be answered and new observations arriving to guide us in this quest. We should learn much more about inflation from the observations that we can anticipate over the next few years. Likewise we can hope to learn about the true nature of dark matter from laboratory and new accelerator experiments that are underway or soon to be operating, as at the LHC. We hope to learn more about possible extra dimensions through observations.

We continue to seek and encourage new ideas and concepts for understanding the universe. These concepts and ideas must pass muster – like a camel going through the eye of a needle – in agreeing with the multitude of precise observations and thereby yield an effective version of our now-working cosmological model. This is the key point of modern cosmology, which is fully flowering and truly exciting. It is the natural consequence and culmination of the path that Galileo started us on four centuries ago.

A MAGIC touch brings astronomical delights

In a simple ceremony on a mountaintop under blue skies and bright sunlight on 25 April, a small group of colleagues, family and friends paused in silence in memory of a young physicist who died there last September. Florian Goebel suffered a fatal accident while putting the finishing touches to the MAGIC-II telescope at the Roque de Los Muchachos Observatory on the Canary Island of La Palma. He had been the project manager and it is fitting that the two MAGIC telescopes were named the Florian Goebel Telescopes at the ceremony. Shortly afterwards, Florian’s brother helped to cut the white, blue and yellow ribbons that symbolically held the telescope, releasing it for its “first light”.

MAGIC-II thus joined its older sibling, MAGIC-I, in exploring the gamma-ray sky, each with a larger segmented mirror than any other reflecting telescope. While MAGIC-I has already made major discoveries, together the two telescopes will make simultaneous observations and achieve a sensitivity three times greater than when working independently.

The Major Atmospheric Gamma-Ray Imaging Cherenkov (MAGIC) project is one of four around the world that use reflecting telescopes to detect the short bursts of Cherenkov light emitted by the showers of charged particles produced when a high-energy gamma ray interacts in the Earth’s upper atmosphere. The High Energy Stereoscopic System (HESS) has been operating in the highlands of Namibia since September 2004; the Very Energetic Radiation Imaging Telescope Array System (VERITAS) in Arizona began its first observations in 2003; and the CANGAROO collaboration between Australia and Japan, which has been observing gamma-ray sources since 1992, began full operation of its most recent telescopes in 2004.

A textbook example

Gamma rays reveal the highest-energy phenomena in the universe, but a major goal for MAGIC is to extend observations to lower gamma-ray energies, which will allow it to see deeper into the universe and farther back in time. At lower energies, the gamma rays are less likely to interact with other light on their long journey through space.

The story of the MAGIC project is a textbook example of the merging of particle physics and astronomy into the modern field of astroparticle physics. For many years, Eckart Lorenz from MPI Munich was a familiar face at CERN and other particle physics laboratories, working in a number of well known collaborations involving the Munich group. By the 1990s he began to apply his expertise in particle-detection techniques to the study of high-energy cosmic gamma rays, in particular by using imaging atmospheric Cherenkov telescopes.

Detecting the Cherenkov light emitted as charged particles pass through a medium faster than light does is a well known technique in particle physics. The method is used to identify charged particles according to their velocities, as implemented for example in the LHCb experiment at CERN. The radiation forms a cone about the particle’s path; the angle of the cone depends on the refractive index of the medium, n, and the particle’s velocity, v. The higher the velocity, the larger the angle, θ, with cosθ = c/(nv), where c is the speed of light in free space. For Cherenkov telescopes the medium is the Earth’s atmosphere, and in gamma-ray showers the particles are primarily electrons and positrons travelling close to the limiting velocity, c.

The showers develop to contain a maximum number of particles around 10 km above sea level; the Cherenkov light that they emit forms a “disc” typically a metre or so thick, with a diameter of about 250 m when it arrives at the Earth’s surface. This disc of light is like an image of the shower. It contains essential information about the direction and the energy of the original gamma ray and – because gamma rays are uninfluenced by magnetic fields in space – in effect points back to the gamma-ray source.
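
These numbers can be cross-checked from the Cherenkov relation cosθ = c/(nv). For particles with v ≈ c the angle is θ = arccos(1/n). A small sketch; the refractive index n ≈ 1.0001 at the ~10 km emission altitude is an assumed round value (n falls with height), and the flat-Earth, straight-line geometry is deliberately simplified:

```python
import math

def cherenkov_angle(n, beta=1.0):
    """Cherenkov emission angle from cos(theta) = 1/(n*beta), in radians."""
    return math.acos(1.0 / (n * beta))

# Near sea level, n ~ 1.0003: the maximum angle is about 1.4 degrees
theta_sl = cherenkov_angle(1.0003)
print(f"sea level: {math.degrees(theta_sl):.2f} deg")

# At the ~10 km shower maximum the air is thinner; assume n ~ 1.0001
theta_10 = cherenkov_angle(1.0001)
diameter_m = 2 * 10_000 * math.tan(theta_10)  # light-pool diameter on the ground
print(f"light pool ~ {diameter_m:.0f} m across")
```

With these assumed values the light pool comes out a few hundred metres across, the same scale as the ~250 m quoted above; a full treatment would integrate the changing refractive index along the light path and account for the shower's lateral spread.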

An imaging atmospheric Cherenkov telescope with its axis pointing in the direction of the source will intercept a small part of the disc of Cherenkov light and form an image of it at the focal point. The main challenge lies in detecting very low intensity light at the level of single photons, because the Cherenkov radiation from the shower is spread across the whole disc. Moreover, the showers from charged primary cosmic rays (hadrons, mainly protons) produce a substantial background with a rate some 10,000 times greater than that of gamma-ray-induced showers. Fortunately, the shape and structure of the two types of shower differ sufficiently for the image in the telescope to have different characteristics. Appropriate image-analysis techniques can ultimately reject the unwanted hadronic showers.
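
In practice this gamma/hadron separation is commonly done with cuts on image-shape ("Hillas") parameters such as the image width and length: gamma-ray images are compact and regular, hadronic ones broad and ragged. A toy sketch of such a selection; the cut values and events are invented for illustration and are not MAGIC's actual analysis:

```python
# Toy gamma/hadron separation on image-shape parameters.
# Cut values below are invented for illustration only.
WIDTH_CUT_DEG = 0.12
LENGTH_CUT_DEG = 0.35

def looks_like_gamma(width_deg, length_deg):
    """Keep compact, regular images; reject the broad, irregular hadronic ones."""
    return width_deg < WIDTH_CUT_DEG and length_deg < LENGTH_CUT_DEG

events = [
    {"width": 0.08, "length": 0.25},  # narrow, regular -> gamma-like
    {"width": 0.30, "length": 0.60},  # broad, ragged   -> hadron-like
]
for ev in events:
    verdict = "gamma-like" if looks_like_gamma(ev["width"], ev["length"]) else "rejected"
    print(ev, verdict)
```

Because hadronic showers outnumber gamma-ray showers by a factor of order 10,000, even modest per-event rejection must be combined with directional and energy cuts to pull a source out of the background.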

Bridging the energy gap

The Cherenkov radiation from cosmic-ray showers constitutes only about 0.01% of the light in the night sky – but it is detectable, as Bill Galbraith and John Jelley first showed at Harwell in the UK in 1953 with not much more than a dustbin with a 60 cm diameter mirror and a photomultiplier tube (PMT) at its focus. Using the same principle, each MAGIC telescope, with its diameter of 17 m, has an array of hundreds of PMTs at the focus – 576 in the case of MAGIC-I and 1039 for MAGIC-II.

The potential for observing cosmic gamma-ray sources through the detection of air showers was first suggested in 1959 by Giuseppe Cocconi, who also proposed that the Crab nebula should be a strong source of high-energy gamma rays. This inspired Aleksandr Chudakov to build a pioneering Cherenkov telescope in Crimea in the early 1960s. It took 30 years before Trevor Weekes and colleagues could finally claim observation of the Crab with the Whipple imaging air Cherenkov telescope in 1989. With its 10 m segmented mirror viewed by an array of PMTs, Whipple pioneered the use of this technique in studies of the gamma-ray sky at energies from around 100 GeV to 10 TeV. It discovered the first source of gamma rays beyond our galaxy, with the detection of very high-energy emission from the active galaxy Markarian 421 (Mkn 421).

Around the same time the High-Energy Gamma-Ray Astronomy (HEGRA) project was also observing air showers with a range of detectors at the Roque de Los Muchachos Observatory. These included five atmospheric Cherenkov telescopes, each with an area of 8.5 m2, which operated in coincidence to achieve better angular resolution and a much improved rejection of background, in particular from hadron showers. The system successfully detected gamma rays up to more than 10 TeV in energy, emitted by the active galactic nuclei Mkn 421 and Mkn 501, which are prime examples of the variable and intense gamma-ray sources known as “blazars”.


While Whipple and HEGRA searched for sources of very high-energy gamma rays, the Energetic Gamma Ray Emission Telescope (EGRET) on board NASA’s Compton Gamma Ray Observatory was collecting a wealth of data on the gamma-ray sky at lower energies, from 20 MeV to 10 GeV. Being above the Earth’s atmosphere, it could detect gamma rays directly. However, its detection area was small, and because the number of gamma rays per unit area falls steeply with energy, such a detector becomes inefficient at higher energies, with a practical limit of about 10 GeV.

It was around this time that Lorenz, who was a member of the HEGRA project, began to dream of bridging the gap in energy accessed by the ground-based and space-based instruments. This would require a larger-area mirror to collect more light, making it more sensitive to showers from the gamma rays below 100 GeV; the minimum detectable energy varies more or less inversely with the area of the mirror. At first the idea did not seem too promising because a large telescope looked likely to cost as much as a satellite. However, Lorenz and colleagues discovered that the German solar-power research programme had built a 17 m reflector dish using a relatively simple construction – and the first ideas for the MAGIC telescope were soon sketched out in a Munich beer garden.
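The scaling argument can be made concrete in a short sketch. The reference values below (a 10 m dish reaching roughly 100 GeV, as quoted for Whipple) are used only for illustration:

```python
import math

def threshold_energy(e_ref, d_ref, d):
    """Minimum detectable gamma-ray energy, assuming (as the text states)
    that it scales roughly inversely with mirror area, i.e. with diameter^2."""
    area_ref = math.pi * (d_ref / 2.0) ** 2
    area = math.pi * (d / 2.0) ** 2
    return e_ref * area_ref / area

# Naively rescaling Whipple's 10 m dish (threshold ~100 GeV) to a 17 m
# dish suggests a threshold a factor (17/10)^2 ~ 2.9 lower.
print(round(threshold_energy(100.0, 10.0, 17.0), 1))  # 34.6
```

This is of course only an order-of-magnitude estimate; the real threshold also depends on mirror reflectivity, photodetector efficiency and the night-sky background.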

Making MAGIC

The main features of the 17 m telescope for MAGIC were clear from the outset. It had to be lightweight to react and move quickly in searches for gamma-ray bursts (GRBs) – the puzzling, powerful phenomena discovered some 30 years ago. At the same time the structure had to be rigid enough to avoid deformations. The chosen solution was to build a framework of carbon-fibre tubes, and the reflector was constructed of lightweight aluminium mirrors, with diamond-machined surfaces and an active control system to adjust each mirror to counteract any small deformations arising in the frame. In addition, the aim was to use the telescope to collect as much data as possible: on moonlit nights and at large zenith angles, close to the horizon, where the Cherenkov radiation reddens, like sunlight, as it travels farther through the atmosphere to the detector. This would require novel phototubes with high quantum efficiency to improve the light collection and to increase sensitivity.


Lorenz first presented MAGIC publicly at the International Cosmic Ray Conference in Rome in 1995. The project had many new ideas – possibly too many. It initially met with resistance: critics said that the construction was too light and it would blow over; the carbon fibre would be too expensive; and so on. However, there were supporters such as the Italian National Institute for Nuclear Physics (INFN) and the Spanish Institute for High Energy Physics (IFAE), which joined the project in 1997, took part in the R&D and participated in the technical design report published in 1998. By this time the collaboration counted nearly 50 members, mainly from Germany, Italy and Spain.

The eventual site for MAGIC was undecided at the time of the technical proposal but it was evident that, like other Cherenkov telescopes, it should be at high altitude in a location with skies clear enough to “see” the faint Cherenkov light. The site at 2200 m on La Palma, already used for HEGRA, had the added advantage of offering relatively stable temperatures, which is important for minimizing thermal stress on the telescope structure.

Funding for construction on La Palma was approved towards the end of 2000, although MAGIC had already benefited from a misfortune that had befallen HEGRA. In 1997 a forest fire had destroyed one third of the detectors; they were insured, and the insurance company stipulated that the money had to be used for ongoing research.

Construction of MAGIC-I began in August 2001 and its inauguration took place in October 2003. The telescope has since observed dozens of high-energy gamma-ray sources: mainly active galactic nuclei like Mkn 421 and Mkn 501 and nebulae around pulsars, but also supernova remnants and binary systems.


The observations include impressive “firsts” and exciting discoveries. On 13 July 2005, for example, the telescope demonstrated its ability to respond rapidly to a GRB alert from NASA’s Swift satellite, locating GRB050713A only 40 s after its explosion. This allowed the first simultaneous observation of a GRB in both high-energy gamma rays and X-rays. In June 2006 the collaboration reported the detection of variable high-energy gamma-ray emission from the microquasar LSI+61 303, a gravitationally bound binary-star system consisting of a massive ordinary star and a compact object of a few solar masses. More recently, in June 2008, MAGIC discovered gamma-ray emission from 3C 279, a quasar more than 5000 million light-years from Earth – making it the most distant source of very high-energy gamma rays yet known. Detecting the gamma radiation from so far away poses interesting questions because, over such great distances, even gamma rays should interact with the background light from stars and galaxies. The universe, it seems, is darker than current theories suggest.

MAGIC-I was always viewed as the project’s first telescope, which would focus resources on an advanced design aiming at as low a detection energy as possible, while maximizing the potential for discoveries. Its success laid the foundations for MAGIC-II – a “twin” to allow studies of greater sensitivity and precision. The design began in 2005 and the construction was finally completed in 2008.


In designing MAGIC-II, the collaboration has benefited both from the experience with MAGIC-I and from technological developments. While the mounts of the two telescopes are essentially the same, the differences lie in the reflecting surface and in particular in the PMTs for the “camera”. MAGIC-II has the same overall reflecting surface as MAGIC-I, but it is made of fewer, larger plates: 140 1 m2 diamond-milled aluminium plates, with 100 additional coated glass mirrors at the outer edges. While the aluminium mirrors have some excellent properties, their technology is not easy to extend to mass production. For MAGIC-II the collaboration turned to glass mirrors and formed a partnership with industry to trial the production of a total of 100 m2.

The camera for MAGIC-II has a larger number of smaller PMTs of a new design with 10% higher quantum efficiency. MAGIC-I has 396 1″ PMTs to cover the inner area, surrounded by 180 1.5″ PMTs for the outer region. With 1039 1″ PMTs, the MAGIC-II camera covers the same area with more pixels and hence has higher resolution.

Working alone in a special low-energy mode, MAGIC-I has already observed gamma rays down to 25 GeV. Operating in unison, the two telescopes will provide better coverage of such low energies and see deeper into the Universe. The pioneering space-borne EGRET now has a more powerful successor in the Large Area Telescope (LAT) on the recently launched Fermi Gamma-Ray Space Telescope, which has an energy range from 20 MeV up to 300 GeV. With the MAGIC twins and the Fermi-LAT, Lorenz’s dream of closing the energy gap is coming close to realization.

The MAGIC collaboration currently consists of some 150 researchers from 24 institutes in Croatia, Bulgaria, Finland, Germany, Italy, Poland, Spain, Switzerland and the US.

Spin and snakes come to the land of Jefferson


The International Spin Physics Symposia series started at the Argonne National Laboratory in 1974, just after its 12 GeV Zero Gradient Synchrotron (ZGS) had accelerated the world’s first polarized proton beam. Paul Dirac gave the keynote lecture, in which he reviewed the history of spin, starting with the first ideas in the 1920s. The 18th Symposium, SPIN 2008, was held in October 2008 at the University of Virginia, which was founded by Thomas Jefferson in 1821. Jefferson became the third president of the US and is best known as the main author of the US Declaration of Independence. He had a keen interest in science and was president of the American Philosophical Society. As a fitting tribute, an appropriately dressed person – who claimed to be Jefferson – gave the after-dinner talk at SPIN 2008. Speaking with a polite 1800s Virginian accent, he gave wise scientific advice that is as relevant now as it was in the early 19th century.

Symposia highlights

SPIN 2008 was attended by 282 high-energy and nuclear spin physicists from around the world. There were 195 parallel talks and 37 plenary talks, so this report mentions only a few of the exciting highlights. After the welcoming talks, the ever-enthusiastic Elliot Leader of Imperial College, London, opened the symposium with a rousing lecture on “The power of spin: a scalpel-like probe of theoretical ideas”. He was followed by Klaus Rith of Erlangen, who gave a detailed experimental overview in his talk that addressed selected highlights of spin experiments and their technological challenges.


The main highlight of the symposium was the success of Brookhaven’s RHIC in its operations as a polarized-proton collider. The machine has produced a great deal of high-quality data in 100 GeV-on-100 GeV collisions and had a brief but successful test of stored 250 GeV polarized protons. For this impressive achievement, Thomas Roser, Mei Bai and their team of polarized-beam experts used two Siberian snakes in each RHIC ring together with two partial Siberian snakes in the venerable Alternating Gradient Synchrotron, which serves as the injector for RHIC. These operations were possible thanks to some vital external contributions. James Simons, mathematics professor at Stony Brook, a Brookhaven trustee and now a billionaire through Renaissance Technologies, provided $13 million to allow a six-month polarized run of RHIC. Moreover, the long-term support of Akito Arima – a nuclear theorist who became a member of the Japanese Diet and science minister – resulted in more than $20 million for RHIC’s four superconducting Siberian snakes and other essential hardware. The funds were transferred from Japan to Brookhaven via the RIKEN research institute.

One interesting result was the measurement by the BRAHMS experiment at RHIC of the left–right spin asymmetry, An, in the inclusive production of π+ and π− mesons, which was presented by Christine Aidala of the University of Massachusetts at Amherst. The data show that, despite the prediction of perturbative QCD (PQCD) that spin would be unimportant at high energy, the inclusive An at large Feynman-x reached the same value of about 40% at 3900 GeV2 (PLab ≈ 2 TeV/c) as at Argonne’s ZGS, Brookhaven’s AGS and Fermilab at momentum values of 12, 22 and 200 GeV/c, respectively (figure 1). This result might encourage PQCD theorists to define more clearly what is meant by “high” energy.
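The correspondence between centre-of-mass energy squared and fixed-target lab momentum quoted above follows from standard kinematics, s = m1² + m2² + 2·m2·E1. A quick sketch, using the proton mass and the momenta cited in the text:

```python
import math

M_P = 0.938272  # proton mass in GeV/c^2

def mandelstam_s(p_lab, m_beam=M_P, m_target=M_P):
    """Squared centre-of-mass energy s (GeV^2) for a fixed-target
    collision with beam lab momentum p_lab (GeV/c)."""
    e_beam = math.sqrt(p_lab**2 + m_beam**2)
    return m_beam**2 + m_target**2 + 2.0 * m_target * e_beam

# Lab momenta quoted in the text (GeV/c): ZGS, AGS, Fermilab, and the
# ~2 TeV/c equivalent of RHIC's collider energy.
for p in (12.0, 22.0, 200.0, 2000.0):
    print(f"p_lab = {p:7.1f} GeV/c  ->  s = {mandelstam_s(p):8.1f} GeV^2")
```

For p_lab = 2 TeV/c this gives s ≈ 3.8 × 10³ GeV², of the same order as the quoted ≈3900 GeV² (the article's figures are approximate).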


In an overview of the transverse spin structure of the nucleon, Mario Anselmino of Turin reported on the very large observed transverse spin effects, which are still not fully understood. Karl Slifer of the University of Virginia gave a talk on what polarized electron scattering has revealed about the spin content of the nucleon. He discussed the theoretical implications of recent polarized-electron experiments, many of which were done at the 6 GeV polarized-electron ring at the Thomas Jefferson National Accelerator Facility (Jefferson Lab). The use of polarized radioactive beams was the subject of an interesting talk by Koichiro Asahi of Tokyo Institute of Technology.

Speakers also covered the more experimental aspects of spin studies, namely the production of polarized beams and polarized targets. Richard Milner of the Massachusetts Institute of Technology gave an excellent review of the progress towards a future polarized-electron ring, which would allow collisions with either polarized protons or polarized nuclear ions stored in a much larger ring – possibly one of the rings at RHIC. Erhard Steffens of Erlangen, the new chair-elect of the International Spin Physics Committee, summarized the discussions at the second workshop on “How to Polarize Antiprotons”, held in August 2007 at the Cockcroft Institute at the Daresbury Laboratory in the UK. There has been significant progress on this challenging topic during the 22 years since Owen Chamberlain and Alan Krisch organized the first Polarized Antiproton Workshop in 1985 at Bodega Bay near Berkeley, but there is still no clearly defined solution to this difficult problem.


Werner Meyer of Bochum reviewed the continuing progress, since SPIN 2006 in Kyoto, on cryogenic polarized proton and deuteron targets. Brookhaven’s Anatoly Zelenski then described the recent progress on polarized-ion sources – the subject of a joint paper with Alexander Belov of the Institute for Nuclear Research, Troitsk. This progress is important because these polarized sources feed RHIC. Indeed, Brookhaven and the Spin Physics Committee sponsored a recent workshop on this topic at Brookhaven. Matt Poelker of Jefferson Lab reviewed progress on polarized-electron sources and polarimeters, which was the subject of another recent spin workshop at the laboratory. These sources and polarimeters are vital to progress in polarized-electron experiments.

Anatoly Kondratenko of Novosibirsk, who along with Yaroslav Derbenev invented Siberian snakes in the 1970s, gave an interesting talk on his more recent idea, now named Kondratenko Crossing (KC). This uses a symmetric spin-resonance crossing pattern that forces the resonance’s depolarizing effects to cancel themselves. Richard Raymond of Michigan reported on new data from the SPIN@COSY team at the Cooler Synchrotron (COSY) at Forschungszentrum Jülich. The results show that KC works, at least for RF-solenoid-induced resonances with deuterons. In the same parallel session accelerator pioneer Ernest Courant, of Brookhaven, still going strong at 89, discussed his new theoretical work on the behaviour of stored polarized beams.

There was also a special evening plenary session in which the director (or a proxy) of each of the major laboratories involved in spin-physics studies – GSI-Darmstadt, Brookhaven, Jefferson Lab, IHEP-Protvino, JINR-Dubna, J-PARC and COSY-Jülich – reported on their plans. On the last day, Thomas Roser, the new past-chair of the Spin Physics Committee, gave an excellent lecture on the future of high-energy polarized beams. The symposium ended with closing remarks from committee chair Kenichi Imai of Kyoto. He announced that SPIN 2010 would be hosted by Forschungszentrum Jülich, while a high priority would be given to hosting SPIN 2012 somewhere in Russia.

• For more information about SPIN 2008, see www.faculty.virginia.edu/spin2008/.

Big Science, bigger outreach


In 1609 Galileo Galilei made the first recorded telescope observations of the night sky – an event that is being celebrated all through 2009 in the International Year of Astronomy. He soon ran into trouble with the ecclesiastical authorities, partly because he used Italian instead of Latin in many of his letters and books, which gave people access to his new scientific interpretation of the world. Fortunately things have changed since then and today the scientific community and the relevant authorities on scientific policies share a general consensus on the importance of conveying to society the main results and general consequences of research.

Take high-energy physics as an example: over the past few years, dedicated working groups and projects have been set up to develop outreach activities. A good example of an annual activity of this kind, aimed at young physics students and high-school teachers, is the EPPOG Masterclasses, which involve some 80 research institutes and universities across Europe. Recently, several African and American institutes have joined the project.

Permanent or travelling exhibitions are another interesting means for the “large-scale” dissemination of high-energy physics information, showing the public the still-unresolved mysteries of the universe and the gigantic equipment needed in particle accelerators (such as the LHC), detectors and computing systems (such as the Grid).

These valuable initiatives have been unquestionably successful, but their real reach into society is limited by the relatively small number of participants and by competition from other fields (scientific or not) already on the market, such as websites, video games and so on. We can also take advantage of more loosely related activities, such as the film adaptation of Dan Brown’s bestselling novel, Angels & Demons, currently in cinemas around the world. While artists should be allowed creative freedom and their view of science should not be rejected, they sometimes risk being somewhat misleading. I am not particularly enthusiastic about spreading the idea that a bomb made of antimatter stolen from CERN could destroy the Vatican (or any other city). Nonetheless, the association of physics (and, more generally, science) with other social and cultural manifestations should be mutually beneficial and deserves closer attention.

Despite the universality of its principles, methodology and objectivity of results, the advancement of scientific knowledge has proved to be socially dependent, from the golden age of Pericles and the “invention” of democracy to the Renaissance and the rise of humanism together with the birth of modern science. Society itself may fuel scientific advancement in a particular direction: thermodynamics was driven by the need for building more efficient heat engines at the beginning of the Industrial Revolution, thereby decisively contributing to the foundations of classical physics in the 19th century.

As an example from particle physics, CERN was created as a free forum for nuclear science in a Europe devastated by the Second World War, “to encourage the formation of research laboratories in order to increase international scientific collaboration…” (as stated at the Fifth UNESCO Conference in Florence, 1950). The CERN convention was gradually ratified during 1953–54 by the 12 founding member states, while the Treaty of Rome that founded the European Economic Community was signed in 1957. Science often goes ahead of society.

In this regard the Web – born at CERN – has represented a dramatic democratization of knowledge, teaching and information. Virtually free for everybody on the planet (wherever electricity is available), it was an almost direct consequence of the free circulation of scientific data among researchers. More generally, big scientific collaborations are genuine examples of worldwide co-operation between different scientists and technicians regardless of their age, gender, religious beliefs or nationality.

Social needs, in turn, continuously demand technological achievements that ultimately stem from fundamental research. Nuclear and particle physics, for instance, have provided crucial tools for medical diagnosis, from the discovery of X-rays to modern medical-imaging techniques.

Undoubtedly outreach must convey to society the excitement of scientific discovery and the importance of technological returns. However, in my opinion, the message from science should not stop there. Galileo’s Sidereus Nuncius (Sidereal Messenger) heralded in 1610 not only the existence of mountains on the Moon and satellites around Jupiter, but also the dawn of a new epoch. Indeed, the social impact and controversy turned out to be much greater than with De revolutionibus by Nicolaus Copernicus (1543) or Johannes Kepler’s Astronomia nova (1609) – because it was easier to read.

We are currently witnessing crucial developments in society globally, from a fairer economy to extended human rights, environmental protection and nature conservation. While keeping in mind the possible misuse of scientific and technological applications, we should ensure that the virtues traditionally associated with “Big Science”, historically entwined with the social progress of humanity, are held up as an example to counteract ignorance, obscurantism and fanaticism.

FFAGs enter the applications era


After a series of preliminary tests held on 26–27 February, the Research Reactor Institute of Kyoto University (KURRI) received a national licence to conduct experiments for an accelerator-driven subcritical reactor (ADSR) using the Kyoto University Critical Assembly (KUCA). The first ADSR experiment began on 4 March, using a newly developed fixed-field alternating-gradient (FFAG) proton accelerator connected to the KUCA. This marks the first use of an FFAG accelerator built for a specific application rather than as a prototype, and it heralds the start of a new era.

The “Development of an Accelerator Driven Subcritical Reactor using an FFAG Proton Accelerator” project, which is now reaching its goal, was initiated in 2002 under a contract with the Ministry of Education, Culture, Sports, Science and Technology (MEXT) as part of the Technology Development Project for Innovative Nuclear Energy Systems. In the experiment the FFAG accelerator provides a high-energy proton beam to a heavy-metal target in the KUCA to produce spallation neutrons, which in turn drive fission chain reactions in the KUCA-A core.

The aim is to examine the feasibility of an ADSR and to lay the foundations for its development. The fact that the reactor is driven slightly below criticality makes this system intrinsically safe: as soon as the external neutron supply is stopped, fission chain reactions cease. ADSRs may also be useful for the transmutation of long-lived transuranic elements into shorter-lived or stable elements. They therefore have the potential to be used as energy amplifiers, neutron sources and transmutation systems.
