An excess of gamma rays at energies of a few GeV was identified as a good candidate for a dark-matter signal. Two years later, a pair of research articles refutes this interpretation by showing that the excess photons detected by the Fermi Gamma-ray Space Telescope are not smoothly distributed, as would be expected for dark-matter annihilation. Their clustering instead reveals a population of unresolved point sources, likely millisecond pulsars.
The Milky Way is thought to be embedded in a dark-matter halo whose density increases towards the galactic centre. The central region of our Galaxy is therefore a prime target in the search for an electromagnetic signal from dark-matter annihilation. If dark matter is made of weakly interacting massive particles (WIMPs) heavier than protons, such a signal would naturally lie in the GeV energy band. A diffuse gamma-ray emission detected by the Fermi satellite, with properties compatible with a dark-matter origin, raised hopes in recent years of finally detecting this elusive form of matter more directly than through its gravitational effects alone.
Two independent studies published in Physical Review Letters now disprove this interpretation. Using different statistical-analysis methods, the two research teams found that the gamma rays of the excess emission at the galactic centre are not distributed as expected for dark matter. Both find evidence for a population of unresolved point sources rather than a smooth distribution.
The first study, led by Richard Bartels of the University of Amsterdam, the Netherlands, uses a wavelet transformation of the Fermi gamma-ray images. The technique consists of convolving the photon count map with a wavelet kernel shaped like a Mexican hat, with a width tuned near the Fermi angular resolution of 0.4° in the relevant energy band of 1–4 GeV. The intensity distribution of the derived wavelet peaks is found to be inconsistent with that expected from a truly diffuse origin of the emission. It suggests instead that the entire excess emission is due to a population of mostly undetected point sources with characteristics matching those of millisecond pulsars.
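The essence of the technique can be sketched in a few lines of Python. This is a minimal illustration, not the authors' pipeline: the map size, pixel scale, injected sources and peak threshold are all assumptions chosen for demonstration.

```python
import numpy as np
from scipy import ndimage

# Illustrative set-up: a 40 x 40 degree count map with 0.1 degree pixels,
# Poisson background plus a handful of faint injected point sources.
rng = np.random.default_rng(0)
npix, pix_deg = 400, 0.1
counts = rng.poisson(lam=2.0, size=(npix, npix)).astype(float)
for _ in range(30):
    x, y = rng.integers(20, npix - 20, size=2)
    counts[x, y] += rng.poisson(15)

# Mexican-hat filtering: the (negated) Laplacian of a Gaussian is the
# standard 2D Mexican-hat kernel; its width is set near the 0.4 degree
# angular resolution quoted for the 1-4 GeV band.
sigma_pix = 0.4 / pix_deg
wavelet_map = -ndimage.gaussian_laplace(counts, sigma=sigma_pix)

# Local maxima above an (arbitrary) threshold: photons clustered by
# unresolved sources appear as an excess of strong, compact peaks.
peaks = wavelet_map == ndimage.maximum_filter(wavelet_map, size=9)
peaks &= wavelet_map > 5 * wavelet_map.std()
print(f"{peaks.sum()} significant wavelet peaks")
```

In the actual analysis it is the statistics of the wavelet-peak intensities, rather than a simple peak count, that discriminates a population of point sources from truly diffuse emission.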
These results are corroborated by another study, led by Samuel Lee of the Broad Institute in Cambridge and Princeton University. This US team used a new statistical method – called a non-Poissonian template fit – to estimate the contribution of unresolved point sources to the gamma-ray excess emission at the galactic centre. The team’s results predict a new population of hundreds of point sources hiding below the detection threshold of Fermi. Detecting the brightest of them in the years to come with ongoing observations would confirm this prediction.
In the coming decade, new facilities at radio frequencies will be able to detect hundreds of new millisecond pulsars in the central region of the Milky Way. This would definitively rule out the dark-matter interpretation of the GeV excess seen by Fermi. In the meantime, the quest to identify the nature of dark matter goes on, but little by little the possibilities are narrowing.
On 11 February, the Laser Interferometer Gravitational-Wave Observatory (LIGO) and Virgo collaborations published a historic paper reporting a gravitational-wave signal emitted by the merger of two black holes. The signal was observed with a significance above 5σ and constitutes the first direct observation of gravitational waves.
This result comes after 20 years of hard work by a large collaboration of scientists operating the two LIGO observatories in the US. Barry Barish, Linde professor of physics, emeritus, at the California Institute of Technology and former director of the Global Design Effort for the International Linear Collider (ILC), led the LIGO endeavour from 1994 to 2005. On the day of the official announcement to the scientific community and the public, Barish was at CERN to give a landmark seminar that captivated the whole audience gathered in the packed Main Auditorium.
The CERN Courier had the unique opportunity to interview Barish just after the announcement.
Professor Barish, this achievement comes after 20 years of hard work, uncertainties and challenges. This is what research is all about, but what was the greatest challenge you had to overcome during this long period?
It really was to do anything that takes 20 years and still be supported and have the energy to reach completion. We started long before that, but the project itself started in 1994. LIGO is an incredible technical achievement. Taking on such high risk in a scientific endeavour requires a lot of support, diligence and perseverance. In 1994, we convinced the US National Science Foundation to fund the project, which became the biggest programme the foundation had ever funded. After that, it took us 10 years to build it and make it work well, plus 10 years to improve the sensitivity and bring it to the point where we were able to detect gravitational waves. And no one had done anything like this before.
Indeed, the experimental set-up we used to detect the gravitational signal is an enormous extrapolation from anything that had been done before. As a physicist, you learn that extrapolating by a factor of two can be within reach, but a factor of 10 already sounds like a dream. If you compare the first 40 m interferometer we built on the Caltech campus with the two 4000 m interferometers we have now, you get an idea of the enormous leap we had to make. The factor-of-100 leap in size involved at least as large a leap in complexity and sophistication, eventually achieving more than 10,000 times the sensitivity of the original 40 m prototype.
The experimental confirmation of the existence of gravitational waves could have a profound impact on the future of astrophysics and gravitational physics. What do you think are the most important consequences of the discovery?
The discovery opens two new areas of research for physics. One concerns the theory of general relativity itself. Gravitational waves are a powerful way of testing the heart of the theory by probing the strong-field regime of gravitational physics. Even with just this first event – the merger of two black holes – we have a true laboratory in which to study all of this, and the prospect of understanding general relativity at an absolutely fundamental level is now opening up.
The second huge consequence of the discovery is that we can now look at the universe with a completely new “telescope”. So far, we have used and built all kinds of telescopes: infrared, ultraviolet, radio, optical… And the idea of recent years has been to look at the same objects in different wavebands.
However, no previous instrument could have seen what we saw with the LIGO interferometers. Nature has been so generous with us that the very first event we saw is new astrophysics: astronomers had never seen stellar black holes of these masses. With just this first glimpse at the universe through gravitational waves, we now know that such black holes exist in pairs and that they can merge. This is all new astrophysics. When we designed LIGO, we thought the first gravitational waves we would see would be emitted by neutron stars. That would still have been a huge discovery, but it would not have brought new astrophysical information. We have been really lucky.
Over the next century, this field will provide a completely new way of doing an incredible amount of new science. And somehow we had a glimpse of that with the first single event.
What were your feelings upon seeing the event on your screen?
We initially thought that it could be some crazy instrumental effect. We had to worry about many possible instrumental glitches, including whether someone had purposely injected a fake event into our data stream. To carefully check the origin of the signal, we tracked back the formation of the event data from the two interferometers, and we could see that the second signal was recorded within seven milliseconds – consistent with the time we expect for the same event to appear at the second interferometer. The two signals were perfectly consistent, and this gave us total trust in our data.
I must admit that I was personally worried because, in physics, it is always very dangerous to claim anything with only one event. However, we performed the analysis in the most rigorous way and followed the normal publication path, namely the submission of the paper to referees. They confirmed that what we submitted was scientifically well justified. In this way, we had the green light to announce the discovery to the public.
At the seminar you were welcomed very warmly by the audience. It was a great honour for the CERN audience to have you give the talk in person, just after your colleagues’ announcement in the US. What are you bringing back from this experience?
I was very happy to present this important achievement in the temple of science. What made me feel that we had made the case well was that people were interested in what we have done and are doing. In the packed auditorium, nobody seemed to question our methodology, our analysis or the validity of our result. We have one single event, but this was good enough to convince me and my colleagues that it was a true discovery. I enjoyed receiving all of the science questions from the audience – it was really a great moment for me.
• The LIGO and Virgo collaborations are currently analysing the rest of the data from the run that ended on 12 January. New results are expected to be published in the coming months. In the meantime, the discovery event is available as open data (see https://losc.ligo.org) for anyone who wants to analyse it.
Doing research with a medical PET cyclotron
Beyond the routine production of radioisotopes for medical imaging, compact PET cyclotrons can form the heart of multidisciplinary research facilities. This is the case at the Bern cyclotron laboratory, which was designed so that the accelerator can be used for scientific purposes in parallel with radioisotope production. Over the years, the installation has become the main instrument for a whole range of research activities involving teams of physicists, chemists, pharmacists and biologists.
Particle accelerators are fundamental instruments in modern medicine, where they are used to study the human body and to detect and treat its diseases. Instrumentation originating from fundamental physics research is very common in hospitals: examples include positron emission tomography (PET) and cancer hadrontherapy.
To match the needs of a continuously evolving field and to fulfil the stringent requirements of hospital-based installations, specific particle accelerators have been developed in recent years. In particular, modern medical cyclotrons devoted to proton cancer treatments and to the production of radioisotopes for diagnostics and therapy are compact, user-friendly, affordable and able to ensure very high performance.
Medical PET cyclotrons usually run during the night or early in the morning to produce the radiotracers that will be used for imaging. Their beams, with energies of about 20 MeV and currents of the order of 100 μA, are in principle available for other purposes during the daytime. This represents an opportunity to exploit the scientific potential of these accelerators well beyond medical-imaging applications. In particular, they can be optimised to produce beams in the picoampere to nanoampere range, opening the way to nuclear and detector physics, materials science, radiation biophysics and radiation-protection research.
On the other hand, cutting-edge multidisciplinary research requires beams of variable shape and intensity, together with the possibility of accessing the beam area. This cannot be realised in standard medical PET cyclotron set-ups, where severe access limitations apply for radiation-protection reasons. Furthermore, the targets for the production of PET radioisotopes are mounted on the cyclotron directly after extraction, which limits the use of the beams. To overcome these problems, medical PET cyclotrons can be equipped with a transport line leading the beam to a second bunker that is always accessible for scientific activities.
The Bern cyclotron laboratory
The Bern medical PET cyclotron laboratory was conceived to use the accelerator for scientific purposes in parallel with radioisotope production. It is situated on the campus of the Inselspital, the Bern University Hospital, and has been in operation since 2013. The heart of the facility consists of an 18 MeV cyclotron providing single or dual beams of H– ions. A maximum extracted current of 150 μA is obtained by stripping the negative ions. Targets can be located in eight different out-ports. Four of them are used for fluorine-18 production, one is equipped with a solid target station, and one is connected to a 6 m-long beam transfer line (BTL). The accelerator is located inside a bunker, while a second bunker with independent access hosts the BTL and is fully dedicated to research. The beam optics of the BTL is realised by one horizontal and one vertical steering magnet, together with two quadrupole doublets – one in the cyclotron bunker and the other in the research area. A neutron shutter prevents neutrons from entering the research bunker during routine production, avoiding radiation damage to scientific instrumentation. The BTL, rather unusually for a hospital cyclotron, represents the pillar of this facility. Although initially more expensive than a standard PET cyclotron facility, this solution ensures complete exploitation of the accelerator beam time and allows for synergy among academic, clinical and industrial partners.
Multidisciplinary research activities
The Bern facility hosts a full programme of multidisciplinary research carried out by a team of physicists, chemists, pharmacists and biologists. The BTL and the associated physics laboratory have so far been the main instruments for research on particle detectors, accelerator physics, radiation protection, and novel radioisotopes for diagnostics and therapy.
To reach beam currents down to the picoampere range, a specific method was developed based on tuning the ion source, the radiofrequency and the current in the main coil. These currents are far below those employed for radioisotope production, and PET cyclotrons are not equipped with instrumentation sensitive enough to measure them. A novel compact beam-profile monitor was therefore conceived and built to measure, control and use these low-intensity beams. A scintillating fibre crossing the beam produces light that is collected to measure the beam profile. Specific doped-silica scintillating fibres were produced in collaboration with the Institute of Applied Physics (IAP) in Bern. The result is a wide-intensity-range beam-monitoring detector able to span currents from 1 pA to 20 μA. The versatility of the instrument attracted the interest of industry, and it has become a commercial spin-off of the research activity. Moreover, the beam monitor was used to measure the transverse beam emittance of cyclotrons, opening the way to further accelerator-physics developments.
The large amount of fluorine-18 produced daily requires a complex radiation-protection monitoring system consisting of about 40 detectors. Besides γ and neutron monitoring, special attention is paid to air contamination – a potential danger for workers and the public. This system is both a safety and a research tool. Radioactivity induced in air by proton and neutron beams was studied and the produced activity measured. The results were in good agreement with calculations based on excitation functions, and can be used to assess the radioactivity induced in air by proton and neutron beams in the energy range of PET cyclotrons. A direct application of this study is the radiation-protection assessment of scientific activities requiring beam extraction into air.
Another distinctive feature of the Bern cyclotron is its radiopharmacy, conceived to bring together industrial production for medicine and scientific research. It features three Good Manufacturing Practice (GMP)-qualified laboratories, one of which is fully devoted to research. The existence of this laboratory and of the BTL brought together physicists and radiochemists of the University of Bern and of the Paul Scherrer Institute (PSI), triggering a multidisciplinary project funded by the Swiss National Science Foundation (SNSF). Scandium-43 is proposed as a novel PET radioisotope, with nearly ideal nuclear-decay properties for PET. Furthermore, scandium is suitable for theranostics (combined diagnostics and therapy): the same biomolecule can be labelled with a positron-emitting isotope for imaging and with a β–-emitting one for cancer therapy. Such advances in nuclear medicine will only be possible if suitable quantities of scandium-43 become available. The goal of the project is to produce clinically relevant amounts of this radioisotope with a quality appropriate for clinical trials.
The results described above represent examples of the wide spectrum of research activities that can be pursued at the Bern facility. Several other fields can be addressed, such as the study of materials by PIXE and PIGE ion-beam analysis, irradiation of biological samples, and investigation of the radiation hardness of scientific instrumentation.
The organisation of a facility of this kind naturally triggers national and international collaborations. The 12th workshop of the European Cyclotron Network (CYCLEUR) will take place in Bern on 23–24 June 2016, to bring together international experts. Last but not least, students and young researchers can profit from unique training opportunities in a stimulating, multidisciplinary environment, to move towards further advances in the application of particle-physics technologies.
The ALICE experiment is devoted to the study of strongly interacting matter at temperatures sufficiently high to overcome hadronic confinement, where the effective degrees of freedom are quasi-free quarks and gluons. This type of matter, known as the quark–gluon plasma (QGP), has been produced in collisions of lead ions at the LHC since 2010. The detectors of the ALICE central barrel aim to provide a complete reconstruction of the final state of Pb–Pb collisions, including charged-particle tracking and particle identification (PID). The latter is achieved by measuring the specific ionisation energy loss, dE/dx.
The main tracking and PID device is the ALICE time projection chamber (TPC). With an active volume of almost 90 m3, the ALICE TPC is the largest detector of its type ever built. During the LHC’s Runs 1 and 2, the TPC reached or even exceeded its design specifications in terms of track reconstruction, momentum resolution and PID capabilities.
ALICE is planning a substantial detector upgrade during the LHC’s second long shutdown, including a new inner tracking system and an upgrade of the TPC. This upgrade will allow the experiment to overcome the TPC’s essential limitation, which is the intrinsic dead time imposed by an active ion-gating scheme. In essence, the event rate with the upgraded TPC in LHC Run 3 will exceed the present one by about a factor of 100.
The rate limitation of the current ALICE TPC arises from the use of a so-called gating grid (GG) – a plane of wires installed in the MWPC-based read-out chambers. The GG is switched by an external pulser system from opaque to transparent mode and back. In the presence of an event trigger, the GG opens for a time window of 100 μs, which allows all ionisation electrons from the drift volume to enter the amplification region. On the other hand, slow-moving ions produced in the avalanche process head back into the drift volume. Therefore, after each event, the GG has to stay closed for 300–500 μs to keep the drift volume free of large space-charge accumulations, which would create massive drift-field distortions. This leads to an intrinsic read-out rate limitation of a few kHz for the current TPC. However, it should be noted that the read-out rate in Pb–Pb collisions is currently limited by the bandwidth of the TPC read-out electronics to a few hundred Hz.
In Run 3, the LHC is expected to deliver Pb–Pb collision rates of about 50 kHz, implying an average pile-up of about five collision events within the drift-time window of the TPC. Moreover, many of the key physics observables lie at low transverse momenta, implying small signal-over-background ratios, which make conventional triggering schemes inappropriate. Hence, the upgrade of the TPC aims at a triggerless, continuous read-out of all collision events. Operating the TPC in such a video-like mode requires exchanging the present MWPC-based read-out chambers for a different technology that eliminates the need for active ion gating; it also entails a complete replacement of the front-end electronics and read-out system.
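Both rate figures follow from simple arithmetic. The sketch below reproduces them, taking a 400 μs closing time (the middle of the quoted 300–500 μs range) and assuming, for illustration, a drift-time window of about 100 μs (the same order as the gate-open window):

```python
# Gated mode: each triggered event costs the 100 us open window plus the
# 300-500 us the gating grid must stay closed afterwards.
gate_open_s = 100e-6
gate_closed_s = 400e-6                      # middle of the quoted range
max_gated_rate_hz = 1.0 / (gate_open_s + gate_closed_s)
print(f"gated read-out limit ~ {max_gated_rate_hz / 1e3:.0f} kHz")  # ~2 kHz

# Run 3: at a 50 kHz Pb-Pb collision rate, the number of collisions that
# overlap within one drift window gives the average pile-up.
collision_rate_hz = 50e3
drift_time_s = 100e-6                       # assumed, for illustration
print(f"average pile-up ~ {collision_rate_hz * drift_time_s:.0f} events")  # ~5
```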
The main challenge for the new read-out chambers is the requirement of large opacity for back-drifting ions, combined with a high efficiency for collecting ionisation electrons from the drift volume into the amplification region, in order to maintain the necessary energy resolution. To allow continuous operation without gating, both requirements must be fulfilled at the same potential settings. In an extensive R&D effort, conducted in close co-operation with CERN’s RD51 collaboration, it was demonstrated that these requirements can be met with an amplification scheme that employs four layers of gas electron multiplier (GEM) foils – a technology put forward by Fabio Sauli and collaborators in the 1990s.
A schematic view of a 4-GEM stack is shown in figure 1. Optimal performance is reached in a setting where the amplification voltages ∆V across the GEMs increase from layer 1 to layer 4. This maximises the average number of GEMs that the produced ions have to pass on their way towards the drift volume, hence minimising the ion-escape probability. Moreover, the electron transparency and ion opacity can be optimised by a suitable combination of high and low transfer fields ET. Finally, the hole pitch of the GEM foils has proven to be an important parameter for the electron and ion transport properties, leading to a solution in which two so-called standard-pitch GEMs (S, hole pitch 140 μm) in layers 1 and 4 sandwich two GEMs with larger pitch (LP, hole pitch 280 μm) in layers 2 and 3.
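The stack geometry can be summarised as a few lines of configuration. In the sketch below, only the S–LP–LP–S pitch ordering and the rule that ∆V increases from layer 1 to layer 4 come from the text; the voltage values themselves are illustrative placeholders, not the tuned ALICE settings.

```python
# 4-GEM stack, listed from the drift side (layer 1) to the pad plane (layer 4).
# Hole pitches as quoted in the text; the Delta-V values are placeholders
# that only illustrate the "increasing towards layer 4" pattern.
gem_stack = [
    {"layer": 1, "type": "S",  "pitch_um": 140, "delta_v": 250},
    {"layer": 2, "type": "LP", "pitch_um": 280, "delta_v": 270},
    {"layer": 3, "type": "LP", "pitch_um": 280, "delta_v": 290},
    {"layer": 4, "type": "S",  "pitch_um": 140, "delta_v": 320},
]

# Back-drifting ions must traverse the layers in reverse order; a rising
# Delta-V profile maximises the number of GEMs they have to pass, which
# minimises the ion-escape probability.
assert all(a["delta_v"] < b["delta_v"] for a, b in zip(gem_stack, gem_stack[1:]))
```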
Following small-size prototype tests in the laboratory, a full-size TPC inner read-out chamber (IROC) with 4-GEM read-out was built and tested in beams at the PS and SPS. To this end, large-size GEM foils were produced at the CERN PH-DT Micro-Pattern Technologies Workshop using so-called single-mask technology (figure 2). As a main result of the test-beam campaigns, the dE/dx performance of the 4-GEM IROC was demonstrated to be the same as that of the existing MWPC IROCs, and the stability against discharges is well suited to operation at the LHC in Run 3 and beyond.
After approval of the Technical Design Report by the LHC Experiments Committee, and an in-depth Engineering Design Review of the new read-out chambers in 2015, the TPC upgrade project is presently in its pre-production phase, aiming to start mass production this summer.
Accurate knowledge of the probability of neutrons interacting with nuclei is a key input in many fields of research. At CERN’s neutron time-of-flight facility, n_TOF, pulsed proton bunches from the Proton Synchrotron (PS) hit a spallation target and produce beams of neutrons with unique characteristics. This allows scientists to perform high-resolution measurements, particularly on radioactive samples.
The story of the n_TOF facility goes back to 1998, when Carlo Rubbia and colleagues proposed the idea of building a neutron facility to measure neutron-reaction data needed for the development of an energy amplifier. The facility eventually became fully operational in 2001, with a scientific programme covering neutron-induced reactions relevant for nuclear astrophysics, nuclear technology and basic nuclear science. During the first major upgrade of the facility in 2009, the old spallation target was removed and replaced by a new target with an optimised design, which included a decoupled cooling and moderation circuit that allowed the use of borated water to reduce the background due to in-beam hydrogen-capture γ rays. A second improvement was the construction of a long-awaited “class-A” workplace, which made it possible to use unsealed radioactive isotopes in the first experimental area (EAR1) at 200 m from the spallation target. In 2014, n_TOF was completed with the construction of a second, vertical beamline and a new experimental area – EAR2.
One of the most striking features of neutron–nucleus interactions is the resonance structures observed in the reaction cross-sections at low incident neutron energies. Because the electrically neutral neutron has no Coulomb barrier to overcome, and interacts negligibly with the electrons in matter, it can penetrate and interact with the atomic nucleus even at very low kinetic energies, of the order of electron-volts. The cross-sections can vary by several orders of magnitude over an energy scale of only a few eV. The origin of these resonances lies in the excitation of nuclear states in the compound system formed by the neutron and the target nucleus, at excitation energies above the neutron binding energy of typically several MeV. Figure 1 shows the main cross-sections for a typical heavy nucleus as a function of energy. The position and extent of the resonance structures depend on the nucleus. Also shown on the same energy scale are Maxwellian neutron energy distributions for neutrons fully moderated by water at room temperature, for fission neutrons, and for typical neutron spectra in the region from 5 to 100 keV, corresponding to the temperatures of stellar environments important for nucleosynthesis.
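For orientation, the Maxwellian distributions mentioned here have the standard thermal form (a textbook expression, not one taken from the n_TOF analyses): the neutron flux per unit energy at temperature $T$ is

$$\phi(E) \,\propto\, E\,\exp\!\left(-\frac{E}{kT}\right),$$

which peaks at $E = kT$ – about 25 meV for water at room temperature, rising to the 5–100 keV range for the stellar temperatures relevant to nucleosynthesis.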
In nuclear astrophysics, an intriguing topic is understanding the formation of nuclei present in the universe and the origin of chemical elements. Hydrogen and smaller amounts of He and Li were created in the early universe by primordial nucleosynthesis. Nuclear reactions in stars are at the origin of nearly all other nuclei, and most nuclei heavier than iron are produced by neutron capture in stellar nucleosynthesis. Neutron-induced reaction cross-sections also reveal the nuclear-level structure in the vicinity of the neutron binding energy of nuclei. Insight into the properties of these levels brings crucial input to nuclear-level density models. Finally, neutron-induced reaction cross-sections are a key ingredient in applications of nuclear technology, including future developments in medical applications and the transmutation of nuclear waste, accelerator-driven systems and nuclear-fuel-cycle investigations.
The wide neutron energy range is one of the key features of the n_TOF facility. The kinetic energy of the particles is directly related to their time of flight: the start time is given by the impact of the proton beam on the spallation target, and the arrival time is measured in the EAR1 and EAR2 experimental areas. The highest neutron energies are directly related to the 20 GeV/c proton-induced spallation reactions in the lead target. The neutrons are subsequently partially moderated to cover the full energy range. Energies as low as about 10 meV, corresponding to long times of flight, can be exploited and measured at n_TOF because the pulsed bunches sent by the PS are spaced by multiples of 1.2 s. This allows long times of flight to be measured without any overlap into the next neutron cycle.
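The time-to-energy conversion at the heart of the technique is compact enough to sketch. The following minimal illustration uses the standard relativistic formula and the textbook 2200 m/s thermal-neutron velocity; the printed figures are rounded:

```python
import math

M_N_MEV = 939.565            # neutron rest mass [MeV/c^2]
C = 299_792_458.0            # speed of light [m/s]

def neutron_energy_mev(flight_path_m: float, tof_s: float) -> float:
    """Relativistic kinetic energy from flight path and time of flight."""
    beta = flight_path_m / (tof_s * C)
    gamma = 1.0 / math.sqrt(1.0 - beta * beta)
    return M_N_MEV * (gamma - 1.0)

# A thermal neutron (~2200 m/s) needs ~91 ms to cover the 200 m to EAR1 --
# comfortably inside the >= 1.2 s bunch spacing, so slow neutrons from one
# bunch do not overlap with fast ones from the next.
e_thermal = neutron_energy_mev(200.0, 200.0 / 2200.0)
print(f"thermal neutron: {e_thermal * 1e9:.0f} meV")       # ~25 meV

# By contrast, a ~10 MeV neutron covers the same path in a few microseconds.
print(f"tof = 4.6 us -> {neutron_energy_mev(200.0, 4.6e-6):.1f} MeV")
```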
Another unique characteristic of n_TOF is the very high number of neutrons per proton burst, also called the instantaneous neutron flux. For research with radioactive samples irradiated in the neutron beam, the high flux results in a very favourable ratio between the number of signals due to neutron-induced reactions and those due to radioactive-decay events, which contribute to the background. While the long flight path of EAR1 (200 m from the spallation target) results in a very high kinetic-energy resolution, the short flight path of EAR2 (20 m from the target) provides a neutron flux higher than that of EAR1 by a factor of about 25. The neutron fluxes in EAR1 and EAR2 are shown in figure 2. The higher flux opens the possibility of measurements on samples of very small mass, or on reactions with low cross-sections, within a reasonable time. The roughly 10 times shorter flight distance also means that the entire neutron energy range is measured in a 10 times shorter interval. For measurements of neutron-induced cross-sections on radioactive nuclei, this means 10 times fewer detector signals due to radioactivity are acquired. The combination of the higher flux and the shorter time interval therefore increases the signal-to-noise ratio by a factor of 250 for radioactive samples. This characteristic of EAR2 was exploited, for example, in its first cross-section measurement in 2014, when the fission cross-section of the highly radioactive isotope 240Pu was successfully measured. An earlier attempt at this measurement in EAR1 was not conclusive. An example from 2015 is the measurement of the (n,α) cross-section of 7Be, also highly radioactive, which is relevant for the cosmological lithium problem in Big Bang nucleosynthesis.
The most important neutron-induced reactions that are measured at n_TOF are neutron-capture and neutron-fission reactions. Several detectors have been developed for this purpose. A 4π calorimeter consisting of 40 BaF2 crystals has been in use for capture measurements since 2004. Several types of C6D6-based liquid-scintillator detectors are also used for measurements of capture γ rays. Different detectors have been developed for charged particles. For fission measurements, ionisation chambers, parallel-plate avalanche counters and the fission-fragment spectrometer STEFF have been operational. MicroMegas-based detectors have been used for fission and (n,α) measurements. Silicon detectors for measuring (n,α) and (n,p) reactions have been developed and used more recently, even for in-beam measurements.
The measurements at CERN’s neutron time-of-flight facility n_TOF, with its unique features, contribute substantially to our knowledge of neutron-induced reactions. This goes hand in hand with cutting-edge developments in detector technology and analysis techniques, the design of challenging experiments, and the training of a new generation of physicists working in neutron physics. This work has been actively supported since the beginning of n_TOF by the European Framework Programmes. A future development currently being studied is a possible upgrade of the spallation target, to optimise the characteristics of the neutron beam in EAR2. The n_TOF collaboration, consisting of about 150 researchers from 40 institutes, looks forward to another year of experiments from its scientific programme in both EAR1 and EAR2, continuing its 15-year history of measuring high-quality neutron-induced reaction data.
Interest in CERN has evolved over the years. At its inception, the Organization’s founding member states clearly saw the new institution’s potential as a centre of excellence for basic research, a driver of innovation, a provider of first-class education and a catalyst for peace. After several decades of business as usual, CERN is again on the radar of its member-state governments. This is spurred on partly by the public interest that has made CERN something of a household name. But whether in the public spotlight or not, it is incumbent on CERN to spell out to all of its stakeholders why it represents such good value for money today, just as it did 60 years ago. Even though the reasons may be familiar to those working at CERN, they are not always so clear to government officials and policy makers, and it’s worth setting them out in detail.
First and foremost, CERN has made major contributions to how we understand the world we live in. The discovery and detailed study of the weak vector bosons in the 1980s and 1990s, and the recent discovery of the Higgs boson, messenger of the Brout–Englert–Higgs mechanism, have contributed much to our understanding of nature at the level of the fundamental particles and their interactions – now rightfully called the Standard Model of particle physics. This on its own is a major cultural achievement, and it has taught us much about how we arrived at this point in history, right from the moment it all began, 13.8 billion years ago. Appreciation of this cultural contribution has never been higher than it is today. More than 100,000 people visit CERN every year, including hundreds of journalists reaching millions of people. None leave CERN unimpressed, and all are, without a doubt, culturally enriched by their experience.
Educating and innovating
CERN’s second major area of impact is education: it has trained many generations of top-level physicists, engineers and technicians. Some have remained at CERN, while others have gone on to pursue careers in basic research at universities and institutes elsewhere, thereby contributing to top-level education and multiplying the effect of their own experience at CERN. Many more, however, have made their way into industry, fulfilling an important mission for CERN – that of providing skilled people to advance the economies of our member states and collaborating nations. More than 500 doctoral degrees are awarded annually on the basis of work carried out at CERN experiments and accelerators. In 2015, more than 400 doctoral, technical and administrative students were welcomed by CERN, usually staying for between several months and a year. The CERN summer-student and teacher programmes, which provide short stints of intensive education, also welcome hundreds of students and high-school teachers every year.
A third important contribution of CERN is the innovation that results from research that requires technology at levels and in areas where no one has gone before. The best-known example of CERN technology is the World Wide Web, which has profoundly changed the way that our society works worldwide. But the web is just the tip of the iceberg. Advances in fields such as magnet technology, cryogenics, electronics, detector technology and statistical methods have also made their way into society in ways that are equally impactful, although less obviously evident. While the societal benefits of techniques such as advanced photon and lepton detection may not seem immediately relevant beyond the realms of research, the impact they have had in medical imaging, for instance, is profound.
Often not very visible, but no less effective in contributing to our prosperity and well-being, developments such as these are a vital part of the research cycle. CERN is increasingly taking a proactive approach to transferring its innovation, knowledge and skills to those who can make them count for society as a whole, and this is generally well appreciated. Recent initiatives include public–private partnerships such as OpenLab, Medipix and IdeaSquare, which provide low-entry-threshold mechanisms for companies to engage with CERN technology. In return, CERN benefits by stimulating the kind of industrial innovation that enables next-generation accelerators and detectors.
The recent Viewpoint by CERN Director-General, Fabiola Gianotti (CERN Courier March 2016 p5) gives a superb outline of the opportunities and challenges for particle physics during the coming years. Clearly it will require great dexterity to juggle the continuation of a state-of-the-art research programme at the LHC and a diverse range of other facilities, with greater engagement with important activities beyond CERN, such as the US neutrino programme, while at the same time preparing for future accelerators and detectors. This will stretch CERN’s capabilities to the limit. But it is precisely this challenge that will motivate the Organization to do better and innovate in all areas, with inevitable benefits for society. Scientific culture and societal impacts advancing hand-in-hand through cutting-edge research: it is this that makes CERN worthy of the support it receives from governments worldwide.
This book provides a comprehensive discussion of quantum confined semiconductor lasers, based on the author’s long and extensive experience in the field. In a pedagogical fashion, it takes the reader from the physics principles and processes exploited by lasers (giving a consistent treatment of both quantum-dot and quantum-well structures) to operation of the most advanced devices.
The text begins with a short historical account of the birth and development of lasers in general (called “masers” at the very beginning, because they were restricted to microwaves) and of the diode laser in particular. Thereafter, the book is organised into five sections. The first, dedicated to the diode laser, provides the framework for the whole volume. The second section describes the fundamental processes involved in the physics of lasers, a subject that is then treated in depth in the third part. The fourth section discusses the operation of laser devices and their characteristics (light–current curves, threshold current, efficiency, etc). Finally, the author tackles the important topics of recombination and optical gain, describing ways in which they can be measured on device structures and compared with theoretical predictions.
Full of detailed explanations, illustrations from model calculations and experimental observations, as well as a comprehensive set of exercises, the book is recommended to final-year undergraduate and PhD students, as well as researchers who are new to the field and need a complete overview of the subject.
Numerical relativity is a field of theoretical physics in which Einstein’s equations and the associated matter-field equations are solved by computer, because these nonlinear partial differential equations cannot be solved analytically for general problems.
The purpose of this volume is to describe the techniques of numerical relativity and to report the knowledge obtained from the numerical simulations performed so far. The first chapter offers an overview of the basics of general relativity, gravitational waves and relativistic astrophysics, which form the background to numerical relativity. Then, in the first part of the book (chapters 2 to 7), the author discusses the most widely used formulations and numerical methods, while in the second part (chapters 8 to 11) he reports on representative numerical-relativity simulations and the knowledge derived from them.
Particular importance is given to the results obtained by applying these simulation techniques to the study of black-hole formation, binary compact objects, and the merger of binary neutron stars and black holes. New frontiers in numerical relativity are also touched on in the last two chapters.
Written by Henry Gould’s assistant Jocelyn Quaintance, this book is the result of deep work and of the personal relationship between the great mathematician and the author. They met when Quaintance, who had recently graduated with a PhD, was looking for a career in research and for an advisor. Quaintance had the luck to collaborate with Gould, who shared his manuscripts: several handwritten volumes on combinatorial identities. Quaintance offered to edit a text collecting all of that material together, which led to the publication of this book.
The first eight chapters introduce readers to the special techniques that Gould used in proving his binomial identities. This first part is easily accessible to people who have taken basic courses in calculus and discrete mathematics. The second half of the book applies the techniques from the first part, and is particularly relevant for mathematics researchers. It focuses on the connection between various classes of Stirling numbers, and between them and Bernoulli numbers.
Some of the proofs presented in the volume represent the only systematic record of Gould’s results. As such, this book is a unique work that could appeal to a wide audience: from graduate students to specialists in enumerative combinatorics, to enthusiasts of Gould’s work.
The challenge of developing ever more intense, shorter-pulse lasers has already produced outstanding results and opened up completely new perspectives. Indeed, the next generation of very-high-power laser facilities will provide the opportunity to explore ultrarelativistic and vacuum nonlinearities at unprecedented levels, moving towards a QCD regime. At the same time, during the last few years attosecond physics has provided a new, intriguing way to visualise atoms and molecules, and even the electromagnetic-field structure of the excitation wave packet itself, because this time domain is comparable with the classical periods of electrons orbiting the nucleus. This growing research field is so recent that the literature on the subject is not yet adequate; in this sense, this book partially fills the gap. It contains contributions from several Chinese groups, both experimental and theoretical, and reports on recent studies of bound-electron and molecular nonlinearities. The content is organised into eight chapters and spans a broad range of topics within this specialist subject.
Strong-field tunnelling is a possible key to the ionisation of neutral atoms and offers a sophisticated method for imaging and probing atomic and molecular quantum processes. In fact, the study of direct and rescattered (by the nucleus) electrons in the ionisation process makes it possible to resolve orbitals; in this context, it becomes important to go beyond the strong-field approximation and to evaluate the contribution of the long-range Coulomb field generated by the ion to the electron’s dynamical evolution (chapter 1).
Direct and rescattered electrons can be recorded together as a reference wave and a signal wave, respectively: the interference patterns constitute the analogue of optical holography, reconstructing the illuminated object. The influence of the Coulomb field can be incorporated either in a numerical solution of the time-dependent Schrödinger equation (TDSE) or in a more intuitive quantum-trajectory Monte Carlo method describing the formation mechanisms of the photoelectron angular distribution in above-threshold ionisation (chapter 2).
Dissociation is a basic process of physical chemistry and, before the advent of the new ultrafast tools, seemed completely beyond scientists’ control, because its typical timescale is below the femtosecond range. For an easier comparison of theoretical predictions with experimental results for a molecule interacting with a strong ultrashort laser pulse, it is necessary to start with the simplest system – the hydrogen molecular ion, H2+. In chapter 3, on the basis of a numerical analysis of the corresponding TDSE, the author suggests a pump–probe strategy for understanding dissociation.
The theoretical treatment of double ionisation in a strong laser field is presented in chapters 4 and 5 for different kinds of atoms. In the high-Z case, experiments show a different degree of correlation between the two ejected electrons with respect to the low-Z case; this is due to the greater importance of rescattering, as described by a semiclassical model. For the simpler systems H2 and He, the TDSE is a powerful tool for calculating all of the main features of double ionisation (total and differential cross-sections, recoil-ion momentum spectra, two-electron angular distributions and two-electron interference phenomena).
A promising application of strong-field excitation of atoms and molecules is high-order harmonic generation (HHG), which typically provides an XUV comb with different harmonics at similar intensities – either as a single attosecond pulse or as a train of attosecond pulses – by converting the light frequency from the IR towards the X-ray regime. This technique provides a tomographic image of molecular orbitals, as an alternative to scanning tunnelling microscopy or angle-resolved photoelectron spectroscopy, as well as a way to study ultrafast electronic structures, electron dynamics and multichannel dynamics (chapters 6 and 7).
Finally, chapter 8 presents an interesting review of the properties of free-electron-laser radiation, showing how nuclear motion in photo-induced reactions can be monitored in real time, how the electronic dynamics can be extracted in molecular co-ordinates, and how site-specific information on the structural dynamics of chemical reactions can be obtained. The experiments are based on EUV pump–probe and optical-pump/X-ray-probe excitation techniques, carried out at FLASH (Hamburg) and LCLS (SLAC), respectively.
In summary, the book is a useful update for readers interested in the specialised field of the interaction of atoms and molecules with femtosecond or sub-femtosecond high-intensity fields. The comprehensive bibliography allows the reader to gain a more exhaustive view of the subject.