By Milutin Blagojević and Friedrich W Hehl (eds.) World Scientific
Hardback: £111 $168 S$222
With a foreword by Tom Kibble and commentaries by Milutin Blagojević and Friedrich W Hehl, the aim of this volume is to introduce graduate and advanced undergraduate students of theoretical or mathematical physics – and other interested researchers – to the field of classical gauge theories of gravity. Intended as a guide to the literature in this field, it encourages readers to study the introductory commentaries and become familiar with the basic content of the reprints and the related ideas, before choosing specific reprints and then returning to the text to focus on further topics.
By Ian D Lawrie CRC Press/Taylor and Francis
Paperback: £44.99
A Unified Grand Tour of Theoretical Physics invites readers on a guided exploration of the theoretical ideas that shape contemporary understanding of the physical world at the fundamental level. Its central themes – which include space–time geometry and the general-relativistic account of gravity, quantum field theory and the gauge theories of fundamental forces – are developed in explicit mathematical detail, with an emphasis on conceptual understanding. Straightforward treatments of the standard models of particle physics and of cosmology are supplemented with introductory accounts of more speculative theories, including supersymmetry and string theory. This third edition includes a new chapter on quantum gravity and new sections with extended discussions of topics such as the Higgs boson, massive neutrinos, cosmological perturbations, dark energy and dark matter.
By Anton Rebhan, Ludmil Katzarkov, Johanna Knapp, Radoslav Rashkov and Emanuel Scheidegger (eds.) World Scientific
Hardback: £104
E-book: £135
This book contains invited contributions from collaborators of Maximilian Kreuzer, a well known string theorist who built a sizeable group at Vienna University of Technology (TU Vienna) but sadly died in November 2010 aged just 50. Victor Batyrev, Philip Candelas, Michael Douglas, Alexei Morozov, Joseph Polchinski, Peter van Nieuwenhuizen and Peter West are among those contributing accounts of Kreuzer’s scientific legacy, alongside original articles. Besides reviews of recent progress in the exploration of string-theory vacua and corresponding mathematical developments, Part I reviews in detail Kreuzer’s important work with Friedemann Brandt and Norbert Dragon on the classification of anomalies in gauge theories. Similarly, Part III contains a user manual for a new, thoroughly revised version of PALP (Package for Analysing Lattice Polytopes with applications to toric geometry), the software developed by Kreuzer and Harald Skarke at TU Vienna.
By Alexander W Chao and Weiren Chou (eds.) World Scientific
Hardback: £111
E-book: £144
Of the roughly 30,000 accelerators at work in the world today, the majority are for applications in industry. This volume of Reviews of Accelerator Science and Technology contains 14 articles on such applications, all by experts in their respective fields. The first eight articles review various applications, from ion-beam analysis to neutron generation, while the next three discuss accelerator technology that has been developed specifically for industry. The twelfth article tackles the challenging subject of future prospects in this rapidly evolving branch of technology. Last, the volume features an article on the success story of CERN by former director-general, Herwig Schopper, as well as a tribute to Simon van der Meer, “A modest genius of accelerator science”.
The Compact Linear Collider (CLIC) and the International Linear Collider (ILC) – two studies for next-generation projects to complement the LHC – now belong to the same organization. The Linear Collider Collaboration (LCC) was officially launched on 21 February at TRIUMF, Canada’s national laboratory for particle and nuclear physics.
The ILC and CLIC have similar physics goals but use different technologies and are at different stages of readiness. The teams working on them have now united in the new organization to make the best use of the synergies between the two projects and to co-ordinate and advance the global development work for a future linear collider. Lyn Evans, former project leader of the LHC, heads the LCC, while Hitoshi Murayama, director of the Kavli Institute for the Physics and Mathematics of the Universe, is deputy-director.
The LCC has three main sections, reflecting the three areas of research that will continue to be conducted. Mike Harrison of Brookhaven National Laboratory leads the ILC section, Steinar Stapnes of CERN leads the CLIC section and Hitoshi Yamamoto of Tohoku University leads the section for physics and detectors. The Linear Collider Board (LCB), with the University of Tokyo’s Sachio Komamiya at the head, is a new oversight committee for the LCC. Appointed by the International Committee for Future Accelerators, the LCB met for the first time at TRIUMF in February. The ILC’s Global Design Effort and its supervisory organization, the ILC Steering Committee, officially handed over their duties to the LCC and LCB in February but they will continue to work together until the official completion of the Technical Design Report for the ILC.
Both the ILC and CLIC will continue to exist and carry on their R&D activities – but with even more synergy between common areas. These include the detectors and the planning of infrastructure, as well as civil-engineering and accelerator aspects. The projects are at different stages of maturity. The CLIC collaboration published its Conceptual Design Report in 2012 and is scheduled to complete the Technical Design Report, which demonstrates feasibility for construction, in a couple of years.
For the ILC collaboration, which will publish its Technical Design Report in June this year, the main focus is on preparing for possible construction while at the same time further advancing acceleration technologies, industrialization and design optimization. The final version of the report will include a new figure for the projected cost. The current estimate is 7.8 thousand million ILC Units (1 ILC unit is equivalent to US$1 of January 2012), plus an explicit estimate for labour costs averaged over the three regional sites, amounting to 23 million person-hours. With the finalization of the Technical Design Report, the ILC’s Global Design Effort, led by Barry Barish, will formally complete its mandate.
With the discovery of the Higgs-like boson at the LHC, the case for a next-generation collider in the near future has received a boost and researchers are thinking of ways to build the linear collider in stages: first as a so-called Higgs factory for precision studies of the new particle; second at an energy of 500 GeV; and third, at double this energy, to open further possibilities for as yet undiscovered physics phenomena. Japan is signalling interest in hosting the ILC.
“Now that the LHC has delivered its first and exciting discovery, I am eager to help the next project on its way,” says Evans. “With the strong support the ILC receives from Japan, the LCC may be getting the tunnelling machines out soon for a Higgs factory in Japan while at the same time pushing frontiers in CLIC technology.”
Michel Borghini, who passed away unexpectedly on 15 December 2012, was at CERN for more than 30 years. Born in 1934, Michel was a citizen of Monaco. He graduated from Ecole Polytechnique in 1955 and went on to obtain a degree in electrical engineering from Ecole Supérieure d’Electricité, Paris, in 1957. He then joined the group of Anatole Abragam at what was the Centre d’Etudes Nucléaires, Saclay, where he took part in the study of dynamic nuclear polarization that led to the development of the first polarized proton targets for use in high-energy physics experiments. It was here that he gained the experience that he was to develop at CERN, to the great benefit of experimental particle physics.
The basic aim with a polarized target is to line up the spins of the protons, say, in a given direction. In principle, this can be done by aligning the spins with a magnetic field but the magnitude of the proton’s magnetic moment is such that it takes little energy to knock them out of alignment; thermal vibrations are sufficient. Even at low temperatures and reasonably high magnetic fields, the polarization achieved by this “brute force” method is small: only 0.1% at a temperature of 1 K and in an applied magnetic field of 1 T. To overcome this limitation, dynamic polarization exploits the much larger magnetic moment of electrons by harnessing the coupling of free proton spins in a material with nearby free electron spins. At temperatures of about 1 K, the electron spins are almost fully polarized in an external magnetic field of 1 T and the application of microwaves of around 70 GHz induces resonant transitions between the spin levels of coupled electron–proton pairs. The effect is to increase the natural, small proton polarization by more than two orders of magnitude. The polarization can be reversed with a slight change of the microwave frequency, with no need to reverse the external magnetic field.
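The numbers quoted above can be checked directly: for a spin-1/2 particle in thermal equilibrium, the polarization is P = tanh(μB/kT). The short sketch below evaluates this for the proton and the electron at 1 K and 1 T, reproducing the quoted 0.1% "brute force" proton figure and showing the far larger electron polarization that dynamic polarization exploits (constant values are CODATA figures; the function name is ours).

```python
import math

# Physical constants (SI units, CODATA values)
K_B = 1.380649e-23    # Boltzmann constant, J/K
MU_P = 1.4106e-26     # proton magnetic moment, J/T
MU_E = 9.2848e-24     # electron magnetic moment (magnitude), J/T

def thermal_polarization(mu, b_field, temperature):
    """Equilibrium polarization of a spin-1/2 particle: P = tanh(mu*B / (k*T))."""
    return math.tanh(mu * b_field / (K_B * temperature))

# "Brute force" proton polarization at 1 K in a 1 T field: about 0.1%,
# in agreement with the figure quoted in the text
p_proton = thermal_polarization(MU_P, 1.0, 1.0)

# The electron moment is ~660 times larger, so under identical conditions
# its equilibrium polarization is several orders of magnitude greater
p_electron = thermal_polarization(MU_E, 1.0, 1.0)

print(f"proton:   {p_proton:.4%}")   # ~0.10%
print(f"electron: {p_electron:.1%}")
```

The ratio of the two results illustrates why transferring the electron polarization to the protons by microwave pumping gains more than two orders of magnitude.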
First experiments
In 1962, Abragam’s group, including Michel, reported on what was the first experiment to measure the scattering of polarized protons – in this case a 20 MeV beam derived from the cyclotron at Saclay – off a polarized proton target (Abragam et al. 1962). The target was a single crystal of lanthanum magnesium nitrate (La2Mg3(NO3)12.24H2O or LMN), with 0.2% of the La3+ replaced with Ce3+, yielding a proton polarization of 20%.
Michel moved to CERN three years later, where he and others from Saclay and CERN had just tested a polarized target in an experiment on proton–proton scattering at 600 MeV at the Synchrocyclotron (SC). Developed by the Saclay group for the higher energy beams of the Proton Synchrotron (PS), the target consisted of a crystal of LMN 4.5 cm long with transverse dimensions 1.2 cm × 1.2 cm and doped with 1% neodymium. It was cooled to around 1 K in a 4He cryostat built in Saclay by Pierre Roubeau, in the field of a 1.8 T magnet designed by CERN’s Guido Petrucci and built in the SC workshop. This target, with an average polarization of around 70%, was used in several experiments at the PS between 1965 and 1968, in both pion and proton beams with momenta of several GeV/c. These experiments measured the polarization parameter for π± elastic scattering and for the charge-exchange reaction π–p→π0n at small values of t, the square of the four-momentum transfer, typically, |t| < 1 GeV2.
In LMN crystals, the fraction of free, polarized protons is only around 1/16 of the total number of target protons. As a consequence, the unpolarized protons bound in the La, Mg, N and O nuclei formed a serious background in these early experiments. This background was reduced by imposing on the final-state particles the strict kinematic constraints expected from the collisions off protons at rest; the residual background was then subtracted by taking special data with a “dummy” target containing no free protons.
Michel’s group at CERN thus began investigating the possibility of developing polarized targets with a higher content of free protons. In this context, in 1968 Michel published two important papers in which he proposed a new phenomenological model of dynamic nuclear polarization: the “spin-temperature model” (Borghini 1968a and 1968b). The model suggested that sizable proton polarizations could be reached in frozen organic liquids doped with paramagnetic radicals. Despite some initial scepticism, in 1969 Michel’s team succeeded in measuring a polarization of around 40% in a 5 cm3 sample consisting of tiny beads made from a frozen mixture of 95% butanol (C4H9OH) and 5% water saturated with the free-radical porphyrexide. The beads were cooled to 1 K in an external magnetic field of 2.5 T and the fraction of free, polarized protons in the sample was around 1/4 – some four times higher than in LMN (Mango, Runólfsson and Borghini 1969).
The group at CERN went on to study a large number of organic materials doped with free paramagnetic radicals, searching for the optimum combination for polarized targets. In this activity, where cryostats based on 3He–4He dilution capable of reaching temperatures below 0.1 K were developed, Michel guided two PhD students: Wim de Boer of the University of Delft (now professor at the Karlsruhe Institute of Technology) and Tapio Niinikoski of the University of Helsinki, who went on to join CERN in 1974. They finally obtained polarizations of almost 100% in samples of propanediol (C3H8O2) doped with chromium (V) complexes and cooled to 0.1 K, in a field of 2.5 T, with 19% free, polarized protons.
In this work, the concept of spin temperature that Michel had proposed was verified by polarizing several nuclei simultaneously in a special sample containing 13C and deuterons. The nuclei had different polarizations but their values corresponded to a single spin temperature in the Boltzmann formula giving the populations of the various spin states.
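The spin-temperature picture can be illustrated numerically: once a common spin temperature is fixed, say by the measured proton polarization, the polarization of any other spin-1/2 species in the sample follows from the same Boltzmann factor. The figures below (a 90% proton polarization at 2.5 T) are hypothetical, chosen only to show the scale involved.

```python
import math

K_B = 1.380649e-23      # Boltzmann constant, J/K
MU_N = 5.0508e-27       # nuclear magneton, J/T
MU_P = 2.7928 * MU_N    # proton magnetic moment, J/T
MU_C13 = 0.7024 * MU_N  # 13C magnetic moment, J/T

B = 2.5  # applied field, T

def spin_temperature(polarization, mu, b_field):
    """Invert P = tanh(mu*B / (k*T_s)) to obtain the spin temperature T_s."""
    return mu * b_field / (K_B * math.atanh(polarization))

def polarization(mu, b_field, t_spin):
    """Spin-1/2 polarization at a given spin temperature."""
    return math.tanh(mu * b_field / (K_B * t_spin))

# A hypothetical measured proton polarization of 90% fixes T_s
# in the millikelvin range ...
t_s = spin_temperature(0.90, MU_P, B)

# ... and the same single T_s then predicts the (smaller) 13C polarization,
# since 13C has a smaller magnetic moment
p_c13 = polarization(MU_C13, B, t_s)

print(f"spin temperature: {t_s * 1e3:.2f} mK")
print(f"13C polarization: {p_c13:.1%}")
```

The key point, which the experiment verified, is that the different polarizations of the different nuclei are all described by one value of the spin temperature.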
These targets were used in a number of experiments at CERN, at both the PS and the Super Proton Synchrotron (SPS). They measured polarization parameters in the elastic scattering of pions, kaons and protons on protons in the forward diffraction region and at backward scattering angles; in the charge-exchange reactions K–p → K̄0n and p̄p → n̄n; in the reaction π–p → K0Λ0; and in proton–deuteron scattering. In all of these experiments, Jean-Michel Rieubland of CERN provided invaluable help to ensure smooth operation of the targets.
In the early 1970s, Michel also initiated the development of “frozen spin” targets. In these targets, the proton spins were first dynamically polarized in a high, uniform magnetic field, and then cooled to a low enough temperature so that the spin-relaxation rate of the protons would be slow even in lower magnetic fields. The targets could then be moved to the detector, thus providing more freedom in the choice of magnetic spectrometers and orientations of the polarization vector. The first frozen spin target was successfully operated at CERN in 1974.
In 1969, Michel took leave from CERN to join the Berkeley group led by Owen Chamberlain working at SLAC, where he took part in a test of T-invariance in inelastic e± scattering from polarized protons in collaboration with the SLAC group led by Richard Taylor. The target, built at Berkeley, was made of butanol and the SLAC 20 GeV spectrometer served as the electron (and positron) analyser. The experiment measured the up–down asymmetry for transverse target spin for both electrons and positrons. No time-reversal violations were seen at the few per cent level.
Michel took leave to work at SLAC again in 1977, this time on a search for parity violation in deep-inelastic scattering of polarized electrons off an unpolarized deuterium target. Here, he worked on the polarized electron source and its associated laser, as well as on the electron spectrometer. The small parity-violation effects expected from the interference of the photon and Z exchanges were, indeed, observed and published in 1978. Michel then moved to the University of Michigan at Ann Arbor, where he joined the group led by Alan Krisch and took part in an experiment to measure proton–proton elastic scattering using both a polarized target and a 6 GeV polarized beam from the 12 GeV Zero Gradient Synchrotron at Argonne National Laboratory.
Michel left CERN’s polarized target group in 1978, succeeded by Niinikoski. Writing in 1985 on major contributions to spin physics, Chamberlain listed the people that he felt to be “the heroes – the people who have given [this] work a special push” (Chamberlain 1985). Michel is the only one that he cites twice: with Abragam and colleagues for the first polarized target and the first experiment to use such a target; and with Niinikoski, for their introduction of the frozen spin target and showing the advantages of powerful (dilution) refrigerators. Today, polarized targets with volumes of several litres and large 3He–4He dilution cryostats are still in operation, for example in the NA58 (COMPASS) experiment at the SPS, where the spin structure of the proton has been studied using deep-inelastic scattering of high-energy muons. Dynamic nuclear polarization has also found applications in medical magnetic-resonance imaging and Michel’s spin-temperature model is still widely used.
In the 1980s, Michel took part in the UA2 experiment at CERN’s SPS proton–antiproton collider, where he contributed to the calibration of the 1444 analogue-to-digital converters (ADCs) that were used to measure the energy deposited in the central calorimeter. He wrote all of the software to drive the large number of precision pulse-generators that monitored the ADC stability during data-taking.
From 1983 to 1996, he was a member of the Executive Committee of the CERN Staff Association, being its vice-president until 1990 and then its president until June 1996. After retiring from CERN in January 1999, he returned to Monaco where in 2003 he was nominated Permanent Representative of Monaco to the United Nations (New York), a post that he kept until 2005.
Michel was an outstanding physicist, equally at ease with theory and with work in the laboratory. He had broad professional competences, a sharp, analytical mind, imagination and organizational skills. He is well remembered by his collaborators for his wisdom and advice, and also for his quiet demeanour and his keen, but often subtle, sense of humour. His culture and interests extended well beyond science. He was also a talented tennis player. He will be sorely missed by those who had the privilege of working with him, or of being among his friends. Much sympathy goes to his two daughters, Anne and Isabelle, and to their families.
As the field of high-energy physics moves inexorably towards full open access under the SCOAP3 agreement, it is worth noting a fact that is often overlooked by the scientific community, namely the concomitant affirmation of the role of scientific journals. Indeed, journals will continue to stand – if not for the primary dissemination of information, then for continued, independent and, yes, competitive and occasionally controversial quality assessment. An ecosystem of dedicated journals is precisely what this requires, on top of open preprint archives whose role is equally undisputed.
Yet, looking at the broader landscape of physics and beyond, open access has quite naturally also brought other changes, namely the emergence of “community journals”. By including a variety of kinds of article, these break with the traditional scheme of long established journals, which are typically devoted to single article types, such as letters, regular articles, technical papers or reviews. To promote and foster this development is precisely the aim of The European Physical Journal C – Particles and Fields (EPJC), where all types of publications relevant to the field of high-energy physics (including astroparticle physics and cosmology) are considered.
EPJC has recently seen a series of significant changes in its editorial board. Since January, Jos Engelen (a former research director at CERN) has been the new editor-in-chief of the “Experimental Physics” section. He succeeds Siggi Bethke, who successfully co-ordinated this section of the journal until the end of 2012. A few months earlier, the board of theoretical-physics editors of EPJC was significantly enlarged. In addition to the traditional board of editors covering the area of “Phenomenology of the Standard Model and Beyond” (now called Theory I), which Gino Isidori has co-ordinated since the end of 2011, Ignatios Antoniadis (the current head of the Theory Unit at CERN) has taken charge of a largely new board of editors covering the areas of gravitation, astroparticle physics and cosmology, general aspects of quantum field theories and alternatives (Theory II).
The latter developments take into account in particular the ongoing rapid “merger” of accelerator-based particle physics with astroparticle physics and cosmology. The next step for our journal will thus be to reach out to experimental non-accelerator physics to provide a first unified platform as a “community journal” in this further extended sense.
Next to letters, regular articles and reviews (tutorial, specialist, technical, topical scientific meeting summaries), EPJC is particularly keen to develop its “tools” section. Neither theory nor experiment in the traditional sense, this section is a platform for publishing computational, statistical and engineering physics of immediate and close relevance for understanding technical or interpretational aspects of theoretical and experimental particle physics.
Last but not least, there is another aspect by which EPJC wishes to stand out, namely in terms of quality assessment. Taking a lead from Karl Popper, we regard science as a social enterprise: humans react quite differently depending on whether they are solicited personally to comment on quality and relevance or only passively, e.g. as recipients of mailing lists or other automated systems.
At EPJC, three independent levels exist to ensure quality control, each mediated by direct communication as peers in the field: the editors-in-chief, the editorial board and the referees. All of them will have been involved in the assessment and decision-making process for every single paper, ensuring a personal, unbiased and fair implementation of the refereeing process, which remains at the core of the activity of any reputable journal.
SCOAP3
The SCOAP3 consortium (Sponsoring Consortium for Open Access Publishing in Particle Physics), which aims to convert journals in high-energy physics to open access, has chosen two Springer journals to participate in the initiative. They are the Journal of High Energy Physics, published for the International School for Advanced Studies (SISSA) in Trieste, Italy, and The European Physical Journal C, published with the Società Italiana di Fisica. The selection is the result of an open and transparent tender process run by CERN for the benefit of SCOAP3, in which journal quality, price and publishing services were taken into account.
Juan José Gómez Cadenas is the director of the Neutrino Physics Group at Valencia University but is best known to the general public as a novelist – in 2008 he wrote Materia Extraña, a scientific thriller (The incurable attraction of physics) – and as an expert in science popularization. Even in a purely scientific environment he is able to deliver information in a most enjoyable way, as I found when I attended a scientific talk that he gave at CERN.
This same ease in communicating is recognizable in El ecologista nuclear, his book about renewable and green energy and the role of nuclear energy. I read the Italian edition of the book and, although the translation was not always perfect and in places detracted from the reading, I really enjoyed the book and its factual approach to this delicate and controversial topic.
Gómez Cadenas makes his point of view clear in the first chapter: “All that glitters is not green.” This could shock the uninitiated because it immediately leads the reader to face “the problem”: climate change is a “bomb (that) has been activated” and humankind is “playing with fire”. The author does not just present this scenario as an opinion. Rather, he justifies all of his statements with graphs, scientific data and evidence.
The chapters that follow are a journey through the various solutions to the problem, in which he makes a strong case for the use of nuclear energy. Using data and graphs, he successfully proves that “safe” nuclear power is the only viable solution. I emphasize the word “safe” because this is the delicate point that matters most to the general public. Unlike other authors, instead of avoiding talking about the problem of safety, Gómez Cadenas discusses it openly, with constant reference to scientific data.
I like the book; I like the author’s open and honest approach, his competence and his rigorous summaries of a global problem that concerns us all. I would recommend reading it before voting on any topic related to the energy problem on our planet.
By Thomas Kuhr Springer Hardback: £117 €137.10
Paperback: £109 €119.19
The Tevatron collider operated by Fermilab close to Chicago was – until the LHC at CERN took over – the most powerful particle accelerator on Earth, colliding protons and antiprotons with, finally, a centre-of-mass energy of almost 2 TeV. Among many interesting results, the key discovery was the observation of the top quark by the CDF and DØ collaborations in 1995. In pp̄ collisions, huge numbers of B and D mesons are also produced, offering sensitive probes for testing the quark-flavour sector of the Standard Model, which is described by the Cabibbo–Kobayashi–Maskawa (CKM) matrix. A closely related topic concerns violation of the charge–parity (CP) symmetry, which can be accommodated through a complex phase in the CKM matrix. Physics beyond the Standard Model may leave footprints in the corresponding observables.
In this branch of particle physics, the key aspect addressed at the upgraded Tevatron (Run-II) was the physics potential of the B0s mesons, which consist of an anti-bottom quark and a strange quark. Since these mesons and their antiparticles were not produced in the e+e– B factories that operated at the Υ(4S) resonance, they fall in the domain of B-physics experiments at hadron colliders, although the Belle experiment could get some access to these particles with the KEK B-factory running at the Υ(5S) resonance. Since the Tevatron stopped operation in autumn 2011, the experimental exploration of the B0s system has been fully conducted at the LHC, with its B-decay experiment LHCb.
The CDF and DØ collaborations did pioneering work in B physics, which culminated in the observation of B0s–B̄0s mixing in 2006, first analyses of CP-violating observables provided by the decay B0s → J/ψφ around 2008, and intriguing measurements by DØ in 2010 of the dimuon charge asymmetry, which probes CP violation in B0s–B̄0s oscillations.
The author of this book has been a member of the CDF collaboration for many years and gives the reader a guided tour through the flavour-physics landscape at the Tevatron. It starts with historical remarks and then focuses on the quark-flavour sector of the Standard Model with the CKM matrix and the theoretical description of mixing and CP violation, before discussing the Tevatron collider, its detectors and experimental techniques. After these introductory chapters, the author brings the reader in touch with key results, starting with measurements of lifetimes and branching ratios of weak b-hadron decays and their theoretical treatment, followed by a discussion of flavour oscillations, where B0s–B̄0s mixing is the highlight. An important part of the book deals with various manifestations of CP violation and the corresponding probes offered by the B0s system, where B0s → J/ψφ and the dimuon charge asymmetry are the main actors. Last, rare decays are discussed, putting the spotlight on the B0s → μ+μ– channel, one of the rarest decay processes that nature has to offer. While the book has a strong focus on the B0s system, it also addresses Λb decays and charm physics.
This well written book with its 161 pages is enjoyable to read and offers a fairly compact way to get an overview of the B-physics programme conducted at the Tevatron in the past decade. A reader familiar with basic concepts of particle physics should be able to deal easily with the content. It appears suited to experimental PhD students making first contact with this topic, but experienced researchers from other branches of high-energy physics may also find the book interesting and useful. Topics such as the rare decay B0s → μ+μ–, which has recently appeared as a first 3.5σ signal in the data from LHCb, and measurements of CP violation in B0s decays will continue to be hot topics in the LHC physics programme during this decade, complementing the direct searches for new particles at the ATLAS and CMS detectors.
The LHC at CERN is a prime example of worldwide collaboration to build a large instrument and pursue frontier science. The discovery there of a particle consistent with the long-sought Higgs boson points to future directions both for the LHC and more broadly for particle physics. Now, the international community is considering machines to complement the LHC and further advance particle physics, including the favoured option: an electron–positron linear collider (LC). Two major global efforts are underway: the International Linear Collider (ILC), which is distributed among many laboratories; and the Compact Linear Collider (CLIC), centred at CERN. Both would collide electrons and positrons at tera-electron-volt energies but have different technologies, energy ranges and timescales. Now, the two efforts are coming closer together and forming a worldwide linear-collider community in the areas of accelerators, detectors and resources.
Last year, the organizers of the 2012 IEEE Nuclear Science Symposium held in Anaheim, California, decided to arrange a Special Linear Collider Event to summarize the accelerator and detector concepts for the ILC and CLIC. Held on 29–30 October, the event also included presentations on the impact of LC technologies for different applications and a discussion forum on LC perspectives. It brought together academic, industry and laboratory-based experts, providing an opportunity to discuss LC progress with the accelerator and instrumentation community at large, and to justify the investments in technology required for future particle accelerators and detectors. Representatives of the US Funding Agencies were also invited to attend.
CERN’s director-general, Rolf Heuer, introduced the event before Steinar Stapnes, CLIC project leader, and Barry Barish, director of the ILC’s Global Design Effort (GDE), reviewed the two projects. The ILC concept is based on superconducting radio-frequency (SRF) cavities, with a nominal accelerating field of 31.5 MV/m, to provide e+e– collisions at sub-tera-electron-volt energies in the centre-of-mass. The CLIC studies focus on an option for a multi-tera-electron-volt machine using a novel two-beam acceleration scheme, with normal-conducting accelerating structures operating at fields as high as 100 MV/m. In this approach, two beams run parallel to each other: the main beam, to be accelerated; and a drive beam, to provide the RF power for the accelerating structures.
Both studies have reached important milestones. The CLIC Conceptual Design Report was released in 2012, with three volumes for physics, detectors and accelerators. The project’s goals for the coming years are well defined, the key challenges being related to system specifications and performance studies for accelerator parts and detectors, technology developments with industry and implementation studies. The aim is to present an implementation plan by 2016, when LHC results at full design energy should become available.
The ILC GDE took a major step towards the final technical design when a draft of the four-volume Technical Design Report (TDR) was presented to the ILC Steering Committee on 15 December 2012 in Tokyo. This describes the successful establishment of the key ILC technologies, as well as advances in the detector R&D and physics studies. Although not released by the time of the NSS meeting, the TDR results served as the basis for the presentations at the special event. The chosen technologies – including SRF cavities with high gradients and state-of-the-art detector concepts – have reached a stage where, should governments decide in favour and a site be chosen, ILC construction could start almost immediately. The ILC TDR, which describes a cost-effective and mature design for an LC in the energy range 200–500 GeV, with a possible upgrade to 1 TeV, is the final deliverable for the GDE mandate.
The newly established Linear Collider Collaboration (LCC), with Lyn Evans as director, will carry out the next steps to integrate the ILC and CLIC efforts under one governance. One highlight in Anaheim was a talk on the physics of the LC by Hitoshi Murayama of the Kavli Institute for Mathematics and Physics of the Universe (IPMU) and future deputy-director for the LCC. He addressed the broader IEEE audience, reviewing how a “Higgs factory” (a 250 GeV machine) as the first phase of the ILC could elucidate the nature of the Higgs particle – complementary to the LHC. The power of the LC lies in its flexibility. It can be tuned to well defined initial states, allowing model-independent measurements from the Higgs threshold to multi-tera-electron-volt energies, as well as precision studies that could reveal new physics at a higher energy scale.
Detailed technical reviews of the ILC and CLIC accelerator concepts and associated technologies followed the opening session. Nick Walker of DESY presented the benefits of using SRF acceleration with a focus on the “globalization” of the technology and the preparation for a worldwide industrial base for the ILC construction. The ultralow cavity-wall losses allow the use of long RF pulses, greatly simplifying the RF source while facilitating efficient acceleration of high-current beams. In addition, the low RF frequency (1.3 GHz) significantly reduces the impedance of the cavities, leading to reduced beam-dynamics effects and relatively relaxed alignment tolerances. More than two decades of R&D have led to a six-fold increase in the available voltage gradient, which – together with integration into a single cryostat (cryomodule) – has resulted in an affordable and mature technology. One of the most important goals of the GDE was to demonstrate that the SRF cavities can be reliably produced in industry. By the end of 2012, two ambitious goals were achieved: to produce cavities qualified at 35 MV/m and to demonstrate that an average gradient of 31.5 MV/m can be reached for ILC cryomodules. These high-gradient cavities have now been produced by industry with 90% yield, acceptable for ILC mass-production.
CERN’s Daniel Schulte reviewed progress with the CLIC concept, which is based on 12 GHz normal-conducting accelerating structures and a two-beam scheme (rather than klystrons) for a cost-effective machine. Over the past two decades, the study has developed high-gradient, micron-precision accelerating structures that now reach more than 100 MV/m, with a breakdown probability of only 3 × 10⁻⁷ per metre per pulse during high-power tests, and more than 145 MV/m in two-beam acceleration tests at the CTF3 facility at CERN (tolerating higher breakdown rates). The CLIC design is compatible with energy-staging from the ILC baseline design of 0.5 TeV up to 3 TeV. The ILC and CLIC studies are collaborating closely on a number of technical R&D issues: beam delivery and final-focus systems, beam dynamics and simulations, positron generation, damping rings and civil engineering.
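As a rough illustration (not a calculation from the talks themselves), the gradient and energy figures quoted above translate directly into the scale of active accelerating structure required per linac; the sketch below ignores the fill factor, focusing and diagnostics sections that a real machine needs.

```python
# Back-of-envelope sketch: active linac length implied by the quoted
# accelerating gradients. Figures for energy and gradient are taken from
# the article; everything else here is illustrative.

def active_length_km(beam_energy_gev: float, gradient_mv_per_m: float) -> float:
    """Active accelerating length in km for one linac, ignoring fill factor."""
    # Convert GeV to MeV, divide by gradient in MV/m, convert m to km.
    return beam_energy_gev * 1e3 / gradient_mv_per_m / 1e3

# ILC baseline: 250 GeV per beam (500 GeV centre of mass) at 31.5 MV/m
ilc = active_length_km(250, 31.5)
# CLIC ultimate stage: 1500 GeV per beam (3 TeV centre of mass) at 100 MV/m
clic = active_length_km(1500, 100)

print(f"ILC  ~{ilc:.1f} km of active structure per linac")
print(f"CLIC ~{clic:.1f} km of active structure per linac")
```

The roughly threefold higher CLIC gradient is what keeps a 3 TeV machine's active length comparable in scale to the sub-TeV ILC linacs.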
Another area of common effort is the development of novel detector technologies. The ILC and CLIC physics programmes are both based on two complementary detectors with a “push–pull concept” to share the beam time between them. Hitoshi Yamamoto of Tohoku University reviewed the overall detector concepts and engineering challenges: multipurpose detectors for high-precision vertex and main tracking; a highly granular calorimeter inside a large solenoid; and power pulsing of electronics. The two ILC detector concepts (ILD and SiD) formed an excellent starting point for the CLIC studies. They were adapted for the higher-energy CLIC beams and ultra-short (0.5 ns) bunch-spacing by using calorimeters with denser absorbers, modified vertex and forward detector geometries, and precise (a few nanosecond) time-stamping to cope with increased beam-induced background.
Vertex detectors, a key element of the LC physics programme, were presented by Marc Winter of CNRS/IPHC Strasbourg. Their requirements address material budget, granularity and power consumption by calling for new pixel technologies. CCDs with 50 μm final pixels, CMOS pixel sensors (MIMOSA) and depleted field-effect transistor (DEPFET) devices have been developed successfully for many years and are suited to running conditions at the ILC. For CLIC, where much faster read-out is mandatory, the R&D concentrates on multilayer devices, such as vertically integrated 3D sensors comprising interconnected layers thinner than 10 μm, which allow a thin charge-sensitive layer to be combined with several tiers of read-out fabricated in different CMOS processes.
Both ILD and SiD require efficient tracking and highly granular calorimeters, optimized for particle-flow event reconstruction, but they differ in their central-tracking approach. ILD aims for a large time-projection chamber (TPC) in a 3.5–4 T field, while SiD is designed as a compact, all-silicon tracking detector with a 5 T solenoid. Tim Nelson of SLAC described tracking in the SiD concept with particular emphasis on power, cooling and minimization of material. Takeshi Matsuda of KEK presented progress on the low material-budget field cage for the TPC and end-plates based on micropattern gas detectors (GEMs, Micromegas or InGrid).
Apart from their dimensions, the electromagnetic calorimeters are similar for the ILC and CLIC concepts, as Jean-Claude Brient of Laboratoire Leprince-Ringuet explained. The ILD and SiD detectors both rely on silicon–tungsten sampling calorimeters, with emphasis on the separation of close electromagnetic showers. The use of small scintillator strips read out by silicon photodiodes operated in Geiger mode (SiPMs) as an active medium is being considered, as well as mixed designs using alternating layers of silicon and scintillator. José Repond of Argonne National Laboratory described the progress in hadron calorimetry, which is optimized to provide the best possible separation of energy deposits from neutral and charged hadrons. Two main options are under study: small plastic scintillator tiles with embedded SiPMs, or higher-granularity calorimeters based on gaseous detectors. Simon Kulis of AGH-UST Cracow addressed the importance of precise luminosity measurement at the LC and the challenges of the forward calorimetry.
In the accelerator instrumentation domain, tolerances at CLIC are much tighter because of the higher gradients. Thibaut Lefevre of CERN, Andrea Jeremie of LAPP/CNRS and Daniel Schulte of CERN all discussed beam instrumentation, alignment and module control, including stabilization. Emittance preservation during beam generation, acceleration and focusing is a key feasibility issue for achieving high luminosity at CLIC. Extremely small beam sizes of 40 nm (1 nm) in the horizontal (vertical) plane at the interaction point require beam-based alignment down to a few micrometres over several hundred metres, and stabilization of the quadrupoles along the linac to nanometres, about an order of magnitude below ground vibrations.
Two sessions were specially organized to discuss potential spin-off from LC detector and accelerator technologies. Marcel Demarteau of Argonne National Laboratory summarized a study report, ILC Detector R&D: Its Impact, which points to the value of sustained support for basic R&D for instrumentation. LC detector R&D has already had an impact in particle physics. For example, the DEPFET technology is chosen as a baseline for the Belle-II vertex detector; an adapted version of the MIMOSA CMOS sensor provides the baseline architecture for the upgrade of the inner tracker in ALICE at the LHC; and construction of the TPC for the T2K experiment has benefited from the ILC TPC R&D programme.
The LC hadron calorimetry collaboration (CALICE) has initiated the large-scale use of SiPMs to read out scintillator stacks (8000 channels) and the medical field has already recognized the potential of these powerful imaging calorimeters for proton computed tomography. Erika Garutti of Hamburg University described another medical imaging technique that could benefit from SiPM technology – positron-emission tomography (PET) assisted by time-of-flight (TOF) measurement, with a coincidence-time resolution of about 300 ps FWHM, which is a factor of two better than devices available commercially. Christophe de La Taille of CNRS presented a number of LC detector applications for volcano studies, astrophysics, nuclear physics and medical imaging.
Accelerator R&D is vital for particle physics. A future LC accelerator would use high-technology coupled to nanoscale precision and control on an industrial scale. Marc Ross of SLAC introduced the session, “LC Accelerator Technologies for Industrial Applications”, with an institutional perspective on the future opportunities of LC technologies. The decision in 2004 to develop the ILC based on SRF cavities allowed an unprecedented degree of global focus and participation and a high level of investment, extending the frontiers of this “technology” in terms of performance, reliability and cost.
Particle accelerators are widely used as tools in the service of science with an ever growing number of applications to society. An overview of industrial, medical and security-related uses for accelerators was presented by Stuart Henderson of Fermilab. A variety of industrial applications makes use of low-energy beams of electrons, protons and ions (about 20,000 instruments) and some 9000 medical accelerators are in operation in the world. One example of how to improve co-ordination between basic and applied accelerator science is the creation of the Illinois Accelerator Research Centre (IARC). This partnership between the US Department of Energy and the State of Illinois aims to unite industry, universities and Fermilab to advance applications that are directly relevant to society.
SRF technology has potential in a number of industrial applications, as Antony Favale of Advanced Energy Systems explained. For example, large, high-power systems could benefit significantly from SRF, even though the costs of cavities and associated cryomodules are higher than for room-temperature linacs; continuous-wave accelerators operating at reasonably high gradients benefit economically and structurally from SRF technology. Industrial markets for SRF accelerators exist in defence, isotope production and accelerator-driven systems for energy production and nuclear-waste mitigation. Walter Wuensch of CERN described how the development of normal-conducting linacs based on the high-gradient 100 MV/m CLIC accelerating structures may be beneficial for a number of accelerator applications, from X-ray free-electron lasers to industrial and medical linacs. Increased performance of high-gradient accelerating structures, translated into lower cost, potentially broadens the market for such accelerators. In addition, industrial applications increasingly require micron-precision 3D geometries, similar to the CLIC prototype accelerating structures. A number of firms have taken steps to extend their capabilities in this area, working closely with the accelerator community.
Steve Lenci of Communications and Power Industries LLC presented an overview of RF technology that supports linear colliders, such as klystrons and power couplers, and discussed the use of similar technologies elsewhere in research and industry. Marc Ross summarized applications of LC instrumentation, used for beam measurements, component monitoring and control and RF feedback.
The Advanced Accelerator Association Promoting Science & Technology (AAA) aims to facilitate industry-government-academia collaboration and to promote and seek industrial applications of advanced technologies derived from R&D on accelerators, with the ILC as a model case. Founded in Japan in 2008, its membership has grown to comprise 90 companies and 38 academic institutions. As the secretary-general Masanori Matsuoka explained, one of the main goals is a study on how to reach a consensus to implement the ILC in Japan and to inform the public of the significance of advanced accelerators and ILC science through social, political and educational events.
The Special Linear Collider Event ended with a forum that brought together directors of the high-energy-physics laboratories and leading experts in LC technologies, from both the academic research sector and industry. A panel discussion, moderated by Brian Foster of the University of Hamburg/DESY, included Rolf Heuer (CERN), Joachim Mnich (DESY), Atsuto Suzuki (KEK), Stuart Henderson (Fermilab), Hitoshi Murayama (IPMU), Steinar Stapnes (CERN) and Akira Yamamoto (KEK).
The ILC has received considerable recent attention from the Japanese government. The round-table discussion therefore began with Suzuki’s presentation of the discovery of a Higgs-like particle at CERN and the emerging initiative toward hosting an ILC in Japan. The formal government statement, which is expected within the next few years, will provide the opportunity for the early implementation of an ILC, and the recent discovery at CERN is strong motivation for a staged approach. This would begin with a 250 GeV machine (a “Higgs factory”), with the possibility of increasing the energy in the longer term. Suzuki also presented the Japan Policy Council’s recommendation, Creation of Global Cities by hosting the ILC, which was published in July 2012.
The discussion then focused on three major issues: the ILC Project Implementation Plan; the ILC Technology Roadmap; and the ILC Added Value to Society. While the possibility of implementing CLIC as a project at CERN to follow the LHC was also on the table, there was less urgency for discussion because the ILC effort counts on an earlier start date. The panellists exchanged many views and opinions with the audience on how the ILC international programme could be financed and how regional priorities could be integrated into a consistent worldwide strategy for the LC. Combining extensive host-lab-based expertise together with resources from individual institutes around the world is a mandatory first step for LC construction, which will also require the development of links between projects, institutions, universities and industry in an ongoing and multifaceted approach.
SRF systems – the central technology for the ILC – have many applications, so a worldwide plan for distributing the mass production of the SRF components is necessary, with technology-transfer proceeding in parallel, in partnership with funding agencies. Another issue discussed related to the model of global collaboration between host/hub-laboratories and industry to build the ILC, where each country shares the costs and human resources. Finally, accelerator and detector developments for the LC have already penetrated many areas of science. The question is how to improve further the transfer of technology from laboratories, so as to develop viable, ongoing businesses of general benefit to society, as in the successful examples presented in Anaheim, such as the IARC facility and the PET-TOF detector.
Last, but not least, this technology-oriented symposium would have been impossible without the tireless efforts of the “Special LC Event” programme committee: Jim Brau, University of Oregon, Juan Fuster (IFIC Valencia), Ingrid-Maria Gregor (DESY Hamburg), Michael Harrison (BNL), Marc Ross (FNAL), Steinar Stapnes (CERN), Maxim Titov (CEA Saclay), Nick Walker (DESY Hamburg), Akira Yamamoto (KEK) and Hitoshi Yamamoto (Tohoku University). In all, this event was considered a real success. More than 90% of participants who answered the conference questionnaire rated it extremely important.
The IEEE tradition
The 2012 Institute of Electrical and Electronics Engineers (IEEE) Nuclear Science Symposium (NSS) and Medical Imaging Conference (MIC), together with the Workshop on Room-Temperature Semiconductor X-Ray and Gamma-Ray Detectors, took place at the Disneyland Hotel, Anaheim, California, on 29 October – 3 November. With more than 850 NSS abstracts received – a record number of NSS submissions for the conference in North America – the 2012 IEEE NSS/MIC symposium attracted more than 2200 attendees. The NSS series, which started in 1954, offers an outstanding opportunity for detector physicists and other scientists and engineers interested in the fields of nuclear science, radiation detection, accelerators, high-energy physics, astrophysics and related software. During the past decade the symposium has become the largest annual event in the area of nuclear and particle-physics instrumentation, providing an international forum to discuss the science and technology of large-scale experimental facilities at the frontiers of research.