Michel Borghini and the rise of polarized targets

Michel’s 1968 paper

Michel Borghini, who passed away unexpectedly on 15 December 2012, was at CERN for more than 30 years. Born in 1934, Michel was a citizen of Monaco. He graduated from Ecole Polytechnique in 1955 and went on to obtain a degree in electrical engineering from Ecole Supérieure d’Electricité, Paris, in 1957. He then joined the group of Anatole Abragam at what was the Centre d’Etudes Nucléaires, Saclay, where he took part in the study of dynamic nuclear polarization that led to the development of the first polarized proton targets for use in high-energy physics experiments. It was here that he gained the experience that he was to develop at CERN, to the great benefit of experimental particle physics.

The basic aim with a polarized target is to line up the spins of the protons, say, in a given direction. In principle, this can be done by aligning the spins with a magnetic field but the magnitude of the proton’s magnetic moment is such that it takes little energy to knock them out of alignment; thermal vibrations are sufficient. Even at low temperatures and reasonably high magnetic fields, the polarization achieved by this “brute force” method is small: only 0.1% at a temperature of 1 K and in an applied magnetic field of 1 T. To overcome this limitation, dynamic polarization exploits the much larger magnetic moment of electrons by harnessing the coupling of free proton spins in a material with nearby free electron spins. At temperatures of about 1 K, the electron spins are almost fully polarized in an external magnetic field of 1 T and the application of microwaves of around 70 GHz induces resonant transitions between the spin levels of coupled electron–proton pairs. The effect is to increase the natural, small proton polarization by more than two orders of magnitude. The polarization can be reversed with a slight change of the microwave frequency, with no need to reverse the external magnetic field.
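The 0.1% figure quoted above follows directly from the Boltzmann populations of the two Zeeman levels of a spin-1/2 moment. A quick back-of-the-envelope check (using standard tabulated values for the magnetic moments; not figures from the original article):

```python
import math

# Thermal-equilibrium ("brute force") polarization of a spin-1/2 moment:
#   P = tanh(mu * B / (k_B * T))
K_B = 1.380649e-23    # Boltzmann constant, J/K
MU_P = 1.41061e-26    # proton magnetic moment, J/T
MU_E = 9.28476e-24    # electron magnetic moment (magnitude), J/T

def brute_force_polarization(mu, b_field, temp):
    """Thermal polarization of an isolated spin-1/2 of moment mu (J/T)."""
    return math.tanh(mu * b_field / (K_B * temp))

# Protons at 1 K and 1 T: about 0.1%, as quoted in the text
print(f"proton polarization: {brute_force_polarization(MU_P, 1.0, 1.0):.2%}")
# The electron moment is some 660 times larger, which is exactly what
# dynamic polarization exploits
print(f"moment ratio mu_e/mu_p: {MU_E / MU_P:.0f}")
```

The tanh form makes the contrast plain: for the proton the argument is of order 10⁻³, so the polarization is tiny and linear in B/T, while the electron's much larger moment drives it towards saturation under the same conditions.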

First experiments

In 1962, Abragam’s group, including Michel, reported on what was the first experiment to measure the scattering of polarized protons – in this case a 20 MeV beam derived from the cyclotron at Saclay – off a polarized proton target (Abragam et al. 1962). The target was a single crystal of lanthanum magnesium nitrate (La2Mg3(NO3)12·24H2O or LMN), with 0.2% of the La3+ replaced with Ce3+, yielding a proton polarization of 20%.

Michel with a polarized target

Michel moved to CERN three years later, where he and others from Saclay and CERN had just tested a polarized target in an experiment on proton–proton scattering at 600 MeV at the Synchrocyclotron (SC). Developed by the Saclay group for the higher-energy beams of the Proton Synchrotron (PS), the target consisted of a crystal of LMN 4.5 cm long with transverse dimensions 1.2 cm × 1.2 cm, doped with 1% neodymium. It was cooled to around 1 K in a 4He cryostat built in Saclay by Pierre Roubeau, in the field of a 1.8 T magnet designed by CERN’s Guido Petrucci and built in the SC workshop. This target, with an average polarization of around 70%, was used in several experiments at the PS between 1965 and 1968, in both pion and proton beams with momenta of several GeV/c. These experiments measured the polarization parameter for π±p elastic scattering and for the charge-exchange reaction π–p → π0n at small values of t, the square of the four-momentum transfer (typically |t| < 1 GeV²).

In LMN crystals, the fraction of free, polarized protons is only around 1/16 of the total number of target protons. As a consequence, the unpolarized protons bound in the La, Mg, N and O nuclei formed a serious background in these early experiments. This background was reduced by imposing on the final-state particles the strict kinematic constraints expected from the collisions off protons at rest; the residual background was then subtracted by taking special data with a “dummy” target containing no free protons.

Michel’s group at CERN thus began investigating the possibility of developing polarized targets with a higher content of free protons. In this context, in 1968 Michel published two important papers in which he proposed a new phenomenological model of dynamic nuclear polarization: the “spin-temperature model” (Borghini 1968a and 1968b). The model suggested that sizable proton polarizations could be reached in frozen organic liquids doped with paramagnetic radicals. Despite some initial scepticism, in 1969 Michel’s team succeeded in measuring a polarization of around 40% in a 5 cm3 sample consisting of tiny beads made from a frozen mixture of 95% butanol (C4H9OH) and 5% water saturated with the free-radical porphyrexide. The beads were cooled to 1 K in an external magnetic field of 2.5 T and the fraction of free, polarized protons in the sample was around 1/4 – some four times higher than in LMN (Mango, Runólfsson and Borghini 1969).

The group at CERN went on to study a large number of organic materials doped with paramagnetic free radicals, searching for the optimum combination for polarized targets. In this activity, in which cryostats based on 3He–4He dilution capable of reaching temperatures below 0.1 K were developed, Michel guided two PhD students: Wim de Boer of the University of Delft (now professor at the Karlsruhe Institute of Technology) and Tapio Niinikoski of the University of Helsinki, who went on to join CERN in 1974. They finally obtained polarizations of almost 100% in samples of propanediol (C3H8O2) doped with chromium(V) complexes and cooled to 0.1 K in a field of 2.5 T, with 19% free, polarized protons.

In this work, the concept of spin temperature that Michel had proposed was verified by polarizing several nuclei simultaneously in a special sample containing 13C and deuterons. The nuclei had different polarizations but their values corresponded to a single spin temperature in the Boltzmann formula giving the populations of the various spin states.
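The spin-temperature idea can be made concrete with a small sketch: once a single spin temperature Ts is established, each spin-1/2 species polarizes as tanh(μB/kBTs), so nuclei with different moments show different polarizations that all trace back to one Ts. The field and spin-temperature values below are illustrative assumptions, not those of the actual experiment (and the deuteron, being spin-1, would need the full Brillouin function rather than tanh):

```python
import math

K_B = 1.380649e-23    # Boltzmann constant, J/K
MU_N = 5.050784e-27   # nuclear magneton, J/T
# Magnetic moments of two spin-1/2 nuclei, in nuclear magnetons
# (standard tabulated values)
MOMENTS_NM = {"proton": 2.7928, "carbon-13": 0.7024}

def spin_temp_polarization(mu_nm, b_field, spin_temp):
    """Polarization of a spin-1/2 species at a common spin temperature Ts."""
    return math.tanh(mu_nm * MU_N * b_field / (K_B * spin_temp))

# Illustrative numbers only: B = 2.5 T and a dynamically cooled
# spin temperature of 3 mK
for species, mu in MOMENTS_NM.items():
    print(f"{species}: {spin_temp_polarization(mu, 2.5, 3e-3):.1%}")
```

With these assumed conditions the proton comes out far more polarized than 13C, yet both values are fixed by the same Ts in the Boltzmann factor, which is precisely the consistency check the measurement described above performed.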

These targets were used in a number of experiments at CERN, at both the PS and the Super Proton Synchrotron (SPS). They measured polarization parameters in the elastic scattering of pions, kaons and protons on protons in the forward diffraction region and at backward scattering angles; in the charge-exchange reactions K–p → K̄0n and p̄p → n̄n; in the reaction π–p → K0Λ0; and in proton–deuteron scattering. In all of these experiments, Jean-Michel Rieubland of CERN provided invaluable help in ensuring the smooth operation of the targets.

In the early 1970s, Michel also initiated the development of “frozen spin” targets. In these targets, the proton spins were first dynamically polarized in a high, uniform magnetic field, and then cooled to a low enough temperature so that the spin-relaxation rate of the protons would be slow even in lower magnetic fields. The targets could then be moved to the detector, thus providing more freedom in the choice of magnetic spectrometers and orientations of the polarization vector. The first frozen spin target was successfully operated at CERN in 1974.

In 1969, Michel took leave from CERN to join the Berkeley group led by Owen Chamberlain working at SLAC, where he took part in a test of T-invariance in inelastic e± scattering from polarized protons in collaboration with the SLAC group led by Richard Taylor. The target, built at Berkeley, was made of butanol and the SLAC 20 GeV spectrometer served as the electron (and positron) analyser. The experiment measured the up–down asymmetry for transverse target spin for both electrons and positrons. No time-reversal violations were seen at the few per cent level.

Michel took leave to work at SLAC again in 1977, this time on a search for parity violation in deep-inelastic scattering of polarized electrons off an unpolarized deuterium target. Here, he worked on the polarized electron source and its associated laser, as well as on the electron spectrometer. The small parity-violation effects expected from the interference of the photon and Z exchanges were, indeed, observed and published in 1978. Michel then moved to the University of Michigan at Ann Arbor, where he joined the group led by Alan Krisch and took part in an experiment to measure proton–proton elastic scattering using both a polarized target and a 6 GeV polarized beam from the 12 GeV Zero Gradient Synchrotron at Argonne National Laboratory.

Michel left CERN’s polarized target group in 1978, succeeded by Niinikoski. Writing in 1985 on major contributions to spin physics, Chamberlain listed the people that he felt to be “the heroes – the people who have given [this] work a special push” (Chamberlain 1985). Michel is the only one that he cites twice: with Abragam and colleagues for the first polarized target and the first experiment to use such a target; and with Niinikoski, for their introduction of the frozen spin target and showing the advantages of powerful (dilution) refrigerators. Today, polarized targets with volumes of several litres and large 3He–4He dilution cryostats are still in operation, for example in the NA58 (COMPASS) experiment at the SPS, where the spin structure of the proton has been studied using deep-inelastic scattering of high-energy muons. Dynamic nuclear polarization has also found applications in medical magnetic-resonance imaging and Michel’s spin-temperature model is still widely used.

In the 1980s, Michel took part in the UA2 experiment at CERN’s SPS proton–antiproton collider, where he contributed to the calibration of the 1444 analogue-to-digital converters (ADCs) that were used to measure the energy deposited in the central calorimeter. He wrote all of the software to drive the large number of precision pulse-generators that monitored the ADC stability during data-taking.

From 1983 to 1996, he was a member of the Executive Committee of the CERN Staff Association, being its vice-president until 1990 and then its president until June 1996. After retiring from CERN in January 1999, he returned to Monaco where in 2003 he was nominated Permanent Representative of Monaco to the United Nations (New York), a post that he kept until 2005.

Michel was an outstanding physicist, equally at ease with theory and in the laboratory. He had broad professional competences, a sharp, analytical mind, imagination and organizational skills. He is well remembered by his collaborators for his wisdom and advice, and also for his quiet demeanour and his keen, but often subtle, sense of humour. His culture and interests extended well beyond science. He was also a talented tennis player. He will be sorely missed by those who had the privilege of working with him, or of being among his friends. Much sympathy goes to his two daughters, Anne and Isabelle, and to their families.

The changing world of EPJC

As the field of high-energy physics moves inexorably towards full open access under the SCOAP3 agreement, it is worth noting a fact that is often overlooked by the scientific community, namely the concomitant affirmation of the role of scientific journals. Indeed, journals will continue to stand – if not for the primary dissemination of information, then for continued, independent and, yes, competitive and occasionally controversial quality assessment. This requires precisely an ecosystem of dedicated journals, on top of open preprint archives, whose role is equally undisputed.

Yet, looking at the broader landscape of physics and beyond, open access has quite naturally also brought other changes, namely the emergence of “community journals”. By including a variety of article types, these break with the traditional scheme of long-established journals, which are typically devoted to a single type, such as letters, regular articles, technical papers or reviews. To promote and foster this development is precisely the aim of The European Physical Journal C – Particles and Fields (EPJC), where all types of publication relevant to the field of high-energy physics (including astroparticle physics and cosmology) are considered.

EPJC has recently seen a series of significant changes in its editorial board. Since January, Jos Engelen (a former research director at CERN) has been the editor-in-chief of the “Experimental Physics” section. He succeeds Siggi Bethke, who successfully co-ordinated this section of the journal until the end of 2012. A few months earlier, the board of theoretical-physics editors of EPJC was significantly enlarged. In addition to the traditional board of editors covering “Phenomenology of the Standard Model and Beyond” (now called Theory-I), which Gino Isidori has co-ordinated since the end of 2011, Ignatios Antoniadis (the current head of the Theory Unit at CERN) has taken charge of a largely new board of editors covering gravitation, astroparticle physics and cosmology, and general aspects of quantum field theories and alternatives (Theory-II).

The latter developments take into account in particular the ongoing rapid “merger” of accelerator-based particle physics with astroparticle physics and cosmology. The next step for our journal will thus be to reach out to experimental non-accelerator physics to provide a first unified platform as a “community journal” in this further extended sense.

Next to letters, regular articles and reviews (tutorial, specialist, technical, topical scientific meeting summaries), EPJC is particularly keen to develop its “tools” section. Neither theory nor experiment in the traditional sense, this section is a platform for publishing computational, statistical and engineering physics of immediate and close relevance for understanding technical or interpretational aspects of theoretical and experimental particle physics.

Last but not least, there is another aspect by which EPJC wishes to stand out, namely in terms of quality assessment. Taking a lead from Karl Popper: science is a social enterprise, and humans react quite differently depending on whether they are solicited personally to comment on quality and relevance or merely passively, e.g. as recipients of mailing lists or other automated systems.

At EPJC, three independent levels exist to ensure quality control, each mediated by direct communication as peers in the field: the editors-in-chief, the editorial board and the referees. All of them will have been involved in the assessment and decision-making process for every single paper, ensuring a personal, unbiased and fair implementation of the refereeing process, which remains at the core of the activity of any reputable journal.

SCOAP3

The SCOAP3 consortium (Sponsoring Consortium for Open Access Publishing in Particle Physics), which aims to convert journals in high-energy physics to open access, has chosen two Springer journals to participate in the initiative. They are the Journal of High-Energy Physics, published for the International School for Advanced Studies (SISSA) in Trieste, Italy, and The European Physical Journal C, published with Società Italiana di Fisica. The selection is the result of an open and transparent tender process run by CERN for the benefit of SCOAP3, in which journal quality, price and publishing services were taken into account.

El ecologista nuclear

By Juan José Gómez Cadenas
Espasa Calpe
Paperback: €22.95

Also published as:

L’ambientalista nucleare
Springer
Paperback: €25
E-book: €22.99

The Nuclear Environmentalist
Springer
Paperback: £24.99 €29.07
E-book: £25.99 €26.99

Juan José Gómez Cadenas is the director of the Neutrino Physics Group at Valencia University but is best known by the general public as a novelist – in 2008 he wrote Materia Extraña, a scientific thriller (The incurable attraction of physics) – and as an expert in science popularization. Even in a purely scientific environment he is able to deliver information in a most enjoyable way, as I found when I attended a scientific talk that he gave at CERN.

This same ease in communicating is recognizable in El ecologista nuclear, his book on renewable and green energy and the role of nuclear energy. I read the Italian edition and, although the translation is not always perfect and in places detracts from the reading, I really enjoyed the book and its factual approach to this delicate and controversial topic.

Gómez Cadenas makes his point of view clear in the first chapter: “All that glitters is not green.” This could shock the uninitiated because it immediately leads the reader to face “the problem”: climate change is a “bomb (that) has been activated” and humankind is “playing with fire”. The author does not just present this scenario as an opinion. Rather, he justifies all of his statements with graphs, scientific data and evidence.

The chapters that follow are a journey through the various solutions to the problem, in which he makes a strong case for the use of nuclear energy. Using data and graphs, he argues convincingly that “safe” nuclear power is the only viable solution. I emphasize the word “safe” because this is the delicate point that matters most to the general public. Unlike other authors, Gómez Cadenas does not avoid the problem of safety but discusses it openly, with constant reference to scientific data.

I like the book; I like the author’s open and honest approach, his competence and his rigorous summaries of a global problem that concerns us all. I would recommend reading it before voting on any topic related to the energy problem on our planet.

Flavor Physics at the Tevatron: Decay, Mixing and CP Violation Measurements in Proton–Antiproton Collisions

By Thomas Kuhr
Springer
Hardback: £117 €137.10
Paperback: £109 €119.19

The Tevatron collider operated by Fermilab close to Chicago was – until the LHC at CERN took over – the most powerful particle accelerator on Earth, colliding protons and antiprotons at, finally, a centre-of-mass energy of almost 2 TeV. Among many interesting results, the key discovery was the observation of the top quark by the CDF and DØ collaborations in 1995. In p̄p collisions, huge numbers of B and D mesons are also produced, offering sensitive probes for testing the quark-flavour sector of the Standard Model, which is described by the Cabibbo-Kobayashi-Maskawa (CKM) matrix. A closely related topic concerns violation of the charge-parity (CP) symmetry, which can be accommodated through a complex phase in the CKM matrix. Physics beyond the Standard Model may leave footprints in the corresponding observables.

In this branch of particle physics, the key aspect addressed at the upgraded Tevatron (Run-II) was the physics potential of the B0s mesons, which consist of an anti-bottom quark and a strange quark. Since these mesons and their antiparticles were not produced at the e+e– B factories that operated at the Υ(4S) resonance, they fall in the domain of B-physics experiments at hadron colliders, although the Belle experiment gained some access to these particles by running the KEK B factory at the Υ(5S) resonance. Since the Tevatron stopped operation in autumn 2011, the experimental exploration of the B0s system has been conducted entirely at the LHC, with its dedicated B-decay experiment, LHCb.

The CDF and DØ collaborations did pioneering work in B physics, which culminated in the observation of B0s–B̄0s mixing in 2006, the first analyses of CP-violating observables in the decay B0s → J/ψφ around 2008, and intriguing measurements of the dimuon charge asymmetry by DØ in 2010, which probe CP violation in B0s–B̄0s oscillations.

The author of this book has been a member of the CDF collaboration for many years and gives the reader a guided tour through the flavour-physics landscape at the Tevatron. It starts with historical remarks and then focuses on the quark-flavour sector of the Standard Model with the CKM matrix and the theoretical description of mixing and CP violation, before discussing the Tevatron collider, its detectors and experimental techniques. After these introductory chapters, the author brings the reader in touch with key results, starting with measurements of lifetimes and branching ratios of weak b-hadron decays and their theoretical treatment, followed by a discussion of flavour oscillations, where B0s–B̄0s mixing is the highlight. An important part of the book deals with various manifestations of CP violation and the corresponding probes offered by the B0s system, where B0s → J/ψφ and the dimuon charge asymmetry are the main actors. Last, rare decays are discussed, putting the spotlight on the B0s → μ+μ– channel, one of the rarest decay processes that nature has to offer. While the book has a strong focus on the B0s system, it also addresses Λb decays and charm physics.

This well-written book of 161 pages is enjoyable to read and offers a fairly compact way to get an overview of the B-physics programme conducted at the Tevatron in the past decade. A reader familiar with basic concepts of particle physics should be able to deal easily with the content. It appears suited to experimental PhD students making first contact with this topic, but experienced researchers from other branches of high-energy physics may also find the book interesting and useful. Topics such as the rare decay B0s → μ+μ–, which has recently appeared as a first 3.5σ signal in the data from LHCb, and measurements of CP violation in B0s decays will continue to be hot topics in the LHC physics programme during this decade, complementing the direct searches for new particles at the ATLAS and CMS detectors.

Linear-collider technologies for all

A superconducting RF accelerator

The LHC at CERN is a prime example of worldwide collaboration to build a large instrument and pursue frontier science. The discovery there of a particle consistent with the long-sought Higgs boson points to future directions both for the LHC and more broadly for particle physics. Now, the international community is considering machines to complement the LHC and further advance particle physics, including the favoured option: an electron–positron linear collider (LC). Two major global efforts are underway: the International Linear Collider (ILC), which is distributed among many laboratories; and the Compact Linear Collider (CLIC), centred at CERN. Both would collide electrons and positrons at tera-electron-volt energies but have different technologies, energy ranges and timescales. Now, the two efforts are coming closer together and forming a worldwide linear-collider community in the areas of accelerators, detectors and resources.

Last year, the organizers of the 2012 IEEE Nuclear Science Symposium held in Anaheim, California, decided to arrange a Special Linear Collider Event to summarize the accelerator and detector concepts for the ILC and CLIC. Held on 29–30 October, the event also included presentations on the impact of LC technologies for different applications and a discussion forum on LC perspectives. It brought together academic, industry and laboratory-based experts, providing an opportunity to discuss LC progress with the accelerator and instrumentation community at large, and to justify the investments in technology required for future particle accelerators and detectors. Representatives of the US Funding Agencies were also invited to attend.

CERN’s director-general, Rolf Heuer, introduced the event before Steinar Stapnes, CLIC project leader, and Barry Barish, director of the ILC’s Global Design Effort (GDE), reviewed the two projects. The ILC concept is based on superconducting radio-frequency (SRF) cavities, with a nominal accelerating field of 31.5 MV/m, to provide e+e– collisions at sub-tera-electron-volt energies in the centre of mass. The CLIC studies focus on an option for a multi-tera-electron-volt machine using a novel two-beam acceleration scheme, with normal-conducting accelerating structures operating at fields as high as 100 MV/m. In this approach, two beams run parallel to each other: the main beam, to be accelerated; and a drive beam, to provide the RF power for the accelerating structures.
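As a rough back-of-the-envelope illustration (my arithmetic, not figures presented at the event): the energy gained is simply the gradient times the active accelerating length, so the quoted gradients translate directly into the scale of the two machines. The real linacs are longer still, once the RF fill factor and other beamline systems are included.

```python
def active_length_km(beam_energy_gev, gradient_mv_per_m):
    """Active accelerating length needed per linac, from E = G * L."""
    # GeV -> MV, divide by MV/m to get metres, then convert to km
    return beam_energy_gev * 1000.0 / gradient_mv_per_m / 1000.0

# ILC at 500 GeV in the centre of mass: 250 GeV per beam at 31.5 MV/m
print(f"ILC:  {active_length_km(250, 31.5):.1f} km of active structure per linac")
# CLIC at 3 TeV: 1500 GeV per beam at 100 MV/m
print(f"CLIC: {active_length_km(1500, 100):.1f} km of active structure per linac")
```

The factor-of-three higher CLIC gradient is what keeps a 3 TeV machine "compact": at the ILC gradient, the same beam energy would need roughly 48 km of active structure per linac.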

Both studies have reached important milestones. The CLIC Conceptual Design Report was released in 2012, with three volumes for physics, detectors and accelerators. The project’s goals for the coming years are well defined, the key challenges being related to system specifications and performance studies for accelerator parts and detectors, technology developments with industry and implementation studies. The aim is to present an implementation plan by 2016, when LHC results at full design energy should become available.

The ILC GDE took a major step towards the final technical design when a draft of the four-volume Technical Design Report (TDR) was presented to the ILC Steering Committee on 15 December 2012 in Tokyo. This describes the successful establishment of the key ILC technologies, as well as advances in the detector R&D and physics studies. Although not released by the time of the NSS meeting, the TDR results served as the basis for the presentations at the special event. The chosen technologies – including SRF cavities with high gradients and state-of-the-art detector concepts – have reached a stage where, should governments decide in favour and a site be chosen, ILC construction could start almost immediately. The ILC TDR, which describes a cost-effective and mature design for an LC in the energy range 200–500 GeV, with a possible upgrade to 1 TeV, is the final deliverable for the GDE mandate.

The newly established Linear Collider Collaboration (LCC), with Lyn Evans as director, will carry out the next steps to integrate the ILC and CLIC efforts under one governance. One highlight in Anaheim was a talk on the physics of the LC by Hitoshi Murayama of the Kavli Institute for the Physics and Mathematics of the Universe (IPMU) and future deputy-director of the LCC. He addressed the broader IEEE audience, reviewing how a “Higgs factory” (a 250 GeV machine) as the first phase of the ILC could elucidate the nature of the Higgs particle – complementary to the LHC. The power of the LC lies in its flexibility. It can be tuned to well defined initial states, allowing model-independent measurements from the Higgs threshold to multi-tera-electron-volt energies, as well as precision studies that could reveal new physics at a higher energy scale.

Detailed technical reviews of the ILC and CLIC accelerator concepts and associated technologies followed the opening session. Nick Walker of DESY presented the benefits of using SRF acceleration with a focus on the “globalization” of the technology and the preparation for a worldwide industrial base for the ILC construction. The ultralow cavity-wall losses allow the use of long RF pulses, greatly simplifying the RF source while facilitating efficient acceleration of high-current beams. In addition, the low RF frequency (1.3 GHz) significantly reduces the impedance of the cavities, leading to reduced beam-dynamics effects and relatively relaxed alignment tolerances. More than two decades of R&D have led to a six-fold increase in the available voltage gradient, which – together with integration into a single cryostat (cryomodule) – has resulted in an affordable and mature technology. One of the most important goals of the GDE was to demonstrate that the SRF cavities can be reliably produced in industry. By the end of 2012, two ambitious goals were achieved: to produce cavities qualified at 35 MV/m and to demonstrate that an average gradient of 31.5 MV/m can be reached for ILC cryomodules. These high-gradient cavities have now been produced by industry with 90% yield, acceptable for ILC mass-production.

CERN’s Daniel Schulte reviewed progress with the CLIC concept, which is based on 12 GHz normal-conducting accelerating structures and a two-beam scheme (rather than klystrons) for a cost-effective machine. Over the past two decades, the study has developed high-gradient, micron-precision accelerating structures that now reach more than 100 MV/m, with a breakdown probability of only 3 × 10–7 m–1 pulse–1 in high-power tests, and more than 145 MV/m in two-beam acceleration tests at the CTF3 facility at CERN (tolerating higher breakdown rates). The CLIC design is compatible with energy-staging from the ILC baseline design of 0.5 TeV up to 3 TeV. The ILC and CLIC studies are collaborating closely on a number of technical R&D issues: beam delivery and final-focus systems, beam dynamics and simulations, positron generation, damping rings and civil engineering.

Another area of common effort is the development of novel detector technologies. The ILC and CLIC physics programmes are both based on two complementary detectors with a “push–pull concept” to share the beam time between them. Hitoshi Yamamoto of Tohoku University reviewed the overall detector concepts and engineering challenges: multipurpose detectors for high-precision vertex and main tracking; a highly granular calorimeter inside a large solenoid; and power pulsing of electronics. The two ILC detector concepts (ILD and SiD) formed an excellent starting point for the CLIC studies. They were adapted for the higher-energy CLIC beams and ultra-short (0.5 ns) bunch-spacing by using calorimeters with denser absorbers, modified vertex and forward detector geometries, and precise (a few nanosecond) time-stamping to cope with increased beam-induced background.

Vertex detectors, a key element of the LC physics programme, were presented by Marc Winter of CNRS/IPHC Strasbourg. Their requirements address material budget, granularity and power consumption by calling for new pixel technologies. CCDs with 50 μm final pixels, CMOS pixel sensors (MIMOSA) and depleted field-effect transistor (DEPFET) devices have been developed successfully for many years and are suited to running conditions at the ILC. For CLIC, where much faster read-out is mandatory, the R&D concentrates on multilayer devices, such as vertically integrated 3D sensors comprising interconnected layers thinner than 10 μm, which allow a thin charge-sensitive layer to be combined with several tiers of read-out fabricated in different CMOS processes.

Both ILD and SiD require efficient tracking and highly granular calorimeters, optimized for particle-flow event reconstruction, but differ in their central-tracking approach. ILD aims for a large time-projection chamber (TPC) in a 3.5–4 T field, while SiD is designed as a compact, all-silicon tracking detector with a 5 T solenoid. Tim Nelson of SLAC described tracking in the SiD concept with particular emphasis on power, cooling and the minimization of material. Takeshi Matsuda of KEK presented progress on the low material-budget field cage for the TPC and end-plates based on micropattern gas detectors (GEMs, Micromegas or InGrid).

Apart from their dimensions, the electromagnetic calorimeters are similar for the ILC and CLIC concepts, as Jean-Claude Brient of Laboratoire Leprince-Ringuet explained. The ILD and SiD detectors both rely on silicon–tungsten sampling calorimeters, with emphasis on the separation of close electromagnetic showers. The use of small scintillator strips read out by silicon photomultipliers (SiPMs – silicon photodiodes operated in Geiger mode) as an active medium is being considered, as well as mixed designs using alternating layers of silicon and scintillator. José Repond of Argonne National Laboratory described the progress in hadron calorimetry, which is optimized to provide the best possible separation of energy deposits from neutral and charged hadrons. Two main options are under study: small plastic-scintillator tiles with embedded SiPMs, or higher-granularity calorimeters based on gaseous detectors. Simon Kulis of AGH-UST Cracow addressed the importance of precise luminosity measurement at the LC and the challenges of forward calorimetry.

The discussion forum

In the accelerator instrumentation domain, tolerances at CLIC are much tighter because of the higher gradients. Thibaut Lefevre of CERN, Andrea Jeremie of LAPP/CNRS and Daniel Schulte of CERN all discussed beam instrumentation, alignment and module control, including stabilization. Emittance preservation during beam generation, acceleration and focusing is a key feasibility issue for achieving high luminosity at CLIC. Extremely small beam sizes of 40 nm (1 nm) in the horizontal (vertical) planes at the interaction point require beam-based alignment down to a few micrometres over several hundred metres and stabilization of the quadrupoles along the linac to nanometres, about an order of magnitude lower than ground vibrations.

Two sessions were specially organized to discuss potential spin-off from LC detector and accelerator technologies. Marcel Demarteau of Argonne National Laboratory summarized a study report, ILC Detector R&D: Its Impact, which points to the value of sustained support for basic R&D for instrumentation. LC detector R&D has already had an impact in particle physics. For example, the DEPFET technology has been chosen as the baseline for the Belle-II vertex detector; an adapted version of the MIMOSA CMOS sensor provides the baseline architecture for the upgrade of the inner tracker in ALICE at the LHC; and construction of the TPC for the T2K experiment has benefited from the ILC TPC R&D programme.

The LC calorimetry collaboration (CALICE) has initiated the large-scale use of SiPMs to read out scintillator stacks (8000 channels) and the medical field has already recognized the potential of these powerful imaging calorimeters for proton computed tomography. Erika Garutti of Hamburg University described another medical-imaging technique that could benefit from SiPM technology – positron-emission tomography (PET) assisted by time-of-flight (TOF) measurement, with a coincidence-time resolution of about 300 ps FWHM, a factor of two better than commercially available devices. Christophe de La Taille of CNRS presented a number of LC detector applications for volcano studies, astrophysics, nuclear physics and medical imaging.
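As a rough illustration of why the timing figure matters (my numbers and function name, not from the article), the coincidence-time resolution translates into a localization uncertainty along the line of response, Δx = cΔt/2:

```python
# Back-of-envelope TOF-PET estimate: a timing spread Delta_t between the two
# annihilation photons shifts the reconstructed point along the line of
# response by Delta_x = c * Delta_t / 2 (the factor 2 because the timing
# difference corresponds to half the path-length difference).
C = 299_792_458.0  # speed of light, m/s

def tof_localization_mm(coincidence_time_fwhm_ps: float) -> float:
    """Position uncertainty (FWHM, in mm) along the line of response."""
    dt = coincidence_time_fwhm_ps * 1e-12  # convert ps to s
    return C * dt / 2 * 1e3  # metres to mm

# ~300 ps FWHM, as quoted for the SiPM-based prototype, gives about 45 mm;
# a commercial device at ~600 ps would give about 90 mm.
print(round(tof_localization_mm(300), 1))  # → 45.0
```

A few centimetres is not yet enough to pinpoint each annihilation directly, but constraining it to that window along the line of response is what improves the signal-to-noise of the reconstructed image.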

Accelerator R&D is vital for particle physics. A future LC accelerator would use high-technology coupled to nanoscale precision and control on an industrial scale. Marc Ross of SLAC introduced the session, “LC Accelerator Technologies for Industrial Applications”, with an institutional perspective on the future opportunities of LC technologies. The decision in 2004 to develop the ILC based on SRF cavities allowed an unprecedented degree of global focus and participation and a high level of investment, extending the frontiers of this “technology” in terms of performance, reliability and cost.

Particle accelerators are widely used as tools in the service of science with an ever growing number of applications to society. An overview of industrial, medical and security-related uses for accelerators was presented by Stuart Henderson of Fermilab. A variety of industrial applications makes use of low-energy beams of electrons, protons and ions (about 20,000 instruments) and some 9000 medical accelerators are in operation in the world. One example of how to improve co-ordination between basic and applied accelerator science is the creation of the Illinois Accelerator Research Centre (IARC). This partnership between the US Department of Energy and the State of Illinois aims to unite industry, universities and Fermilab to advance applications that are directly relevant to society.

SRF technology has potential in a number of industrial applications, as Antony Favale of Advanced Energy Systems explained. For example, large, high-power systems could benefit significantly from SRF, even though the costs of cavities and associated cryomodules are higher than for room-temperature linacs; continuous-wave accelerators operating at reasonably high gradients benefit economically and structurally from SRF technology. Industrial markets for SRF accelerators exist in defence, isotope production and accelerator-driven systems for energy production and nuclear-waste mitigation. Walter Wuensch of CERN described how the development of normal-conducting linacs based on the high-gradient 100 MV/m CLIC accelerating structures may be beneficial for a number of accelerator applications, from X-ray free-electron lasers to industrial and medical linacs. Increased performance of high-gradient accelerating structures, translated into lower cost, potentially broadens the market for such accelerators. In addition, industrial applications increasingly require micron-precision 3D geometries, similar to the CLIC prototype accelerating structures. A number of firms have taken steps to extend their capabilities in this area, working closely with the accelerator community.

Steve Lenci of Communications and Power Industries LLC presented an overview of RF technology that supports linear colliders, such as klystrons and power couplers, and discussed the use of similar technologies elsewhere in research and industry. Marc Ross summarized applications of LC instrumentation, used for beam measurements, component monitoring and control and RF feedback.

The Advanced Accelerator Association Promoting Science & Technology (AAA) aims to facilitate industry-government-academia collaboration and to promote and seek industrial applications of advanced technologies derived from R&D on accelerators, with the ILC as a model case. Founded in Japan in 2008, its membership has grown to comprise 90 companies and 38 academic institutions. As the secretary-general Masanori Matsuoka explained, one of the main goals is to study how to reach a consensus on implementing the ILC in Japan and to inform the public of the significance of advanced accelerators and ILC science through social, political and educational events.

The Special Linear Collider Event ended with a forum that brought together directors of the high-energy-physics laboratories and leading experts in LC technologies, from both the academic research sector and industry. A panel discussion, moderated by Brian Foster of the University of Hamburg/DESY, included Rolf Heuer (CERN), Joachim Mnich (DESY), Atsuto Suzuki (KEK), Stuart Henderson (Fermilab), Hitoshi Murayama (IPMU), Steinar Stapnes (CERN) and Akira Yamamoto (KEK).

The ILC has received considerable recent attention from the Japanese government. The round-table discussion therefore began with Suzuki’s presentation of the discovery of a Higgs-like particle at CERN and the emerging initiative toward hosting an ILC in Japan. The formal government statement, which is expected within the next few years, will provide the opportunity for the early implementation of an ILC and the recent discovery at CERN is strong motivation for a staged approach. This would begin with a 250 GeV machine (a “Higgs factory”), with the possibility of increasing the energy in the longer term. Suzuki also presented the Japan Policy Council’s recommendation, Creation of Global Cities by hosting the ILC, which was published in July 2012.

The discussion then focused on three major issues: the ILC Project Implementation Plan; the ILC Technology Roadmap; and the ILC Added Value to Society. While the possibility of implementing CLIC as a project at CERN to follow the LHC was also on the table, there was less urgency for discussion because the ILC effort counts on an earlier start date. The panellists exchanged many views and opinions with the audience on how the ILC international programme could be financed and how regional priorities could be integrated into a consistent worldwide strategy for the LC. Combining extensive host-lab-based expertise together with resources from individual institutes around the world is a mandatory first step for LC construction, which will also require the development of links between projects, institutions, universities and industry in an ongoing and multifaceted approach.

SRF systems – the central technology for the ILC – have many applications, so a worldwide plan for distributing the mass production of the SRF components is necessary, with technology-transfer proceeding in parallel, in partnership with funding agencies. Another issue discussed related to the model of global collaboration between host/hub-laboratories and industry to build the ILC, where each country shares the costs and human resources. Finally, accelerator and detector developments for the LC have already penetrated many areas of science. The question is how to improve further the transfer of technology from laboratories, so as to develop viable, ongoing businesses that benefit society as a whole, as in successful examples such as the IARC facility and the PET-TOF detector presented in Anaheim.

Last, but not least, this technology-oriented symposium would have been impossible without the tireless efforts of the “Special LC Event” programme committee: Jim Brau (University of Oregon), Juan Fuster (IFIC Valencia), Ingrid-Maria Gregor (DESY Hamburg), Michael Harrison (BNL), Marc Ross (FNAL), Steinar Stapnes (CERN), Maxim Titov (CEA Saclay), Nick Walker (DESY Hamburg), Akira Yamamoto (KEK) and Hitoshi Yamamoto (Tohoku University). In all, the event was considered a real success: more than 90% of participants who answered the conference questionnaire rated it extremely important.

The IEEE tradition

The 2012 Institute of Electrical and Electronics Engineers (IEEE) Nuclear Science Symposium (NSS) and Medical Imaging Conference (MIC), together with the Workshop on Room-Temperature Semiconductor X-Ray and Gamma-Ray Detectors, took place at the Disneyland Hotel, Anaheim, California, from 29 October to 3 November. Having received more than 850 NSS abstracts – a record number of submissions for the conferences in North America – the 2012 IEEE NSS/MIC symposium attracted more than 2200 attendees. The NSS series, which started in 1954, offers an outstanding opportunity for detector physicists and other scientists and engineers interested in the fields of nuclear science, radiation detection, accelerators, high-energy physics, astrophysics and related software. During the past decade the symposium has become the largest annual event in the area of nuclear and particle-physics instrumentation, providing an international forum to discuss the science and technology of large-scale experimental facilities at the frontiers of research.

Lori Ann White, SLAC.

The incurable attraction of physics

A noble gas, a missing scientist and an underground laboratory. It could be the starting point for a classic detective story. But a love story? It seems unlikely. However, add in a back-story set in Spain during General Franco’s rule, plus a “eureka” moment in California, and the ingredients are there for a real romance – all of it rooted firmly in physics.


When the Spanish particle physicist Juan José Gómez Cadenas arrived at CERN as a summer student, the passion that he already had for physics turned into an infatuation. Thirty years later and back in his home country, Gómez Cadenas is pursuing one of nature’s most elusive particles, the neutrino, by looking where it is expected not to appear at all – in neutrinoless double-beta decay. Moreover, fiction has become entwined with fact, as he was recently invited to write a novel set at CERN. The result, Materia Extraña (Strange matter), is a scientific thriller that has already been translated into Italian.

Critical point

“Particle physicists were a rare commodity in Spain when the country first joined CERN in 1961,” Cecilia Jarlskog noted 10 years ago after a visit to “a young and rapidly expanding community” of Spanish particle physicists. Indeed, the country left CERN in 1969, when Juan was only nine years old and Spain was still under the Franco regime. Young Juan – or “JJ” as he later became known – initially wanted to become a naval officer, like his father, but in 1975 he was introduced to the wonders of physics by his cousin, Bernardo Llanas, who had just completed his studies with the Junta de Energía Nuclear (the forerunner of CIEMAT, the Spanish research centre for energy, the environment and technology) at the same time as Juan Antonio Rubio, who was to do so much to re-establish particle physics in Spain. The young JJ set his sights on the subject – “Suddenly the world became magic,” he recalls, “I was lost to physics” – and so began the love affair that was to take him to CERN and, in a strange twist, to write his first novel.

The critical point came in 1983. JJ was one of the first Spanish students to gain a place in CERN’s summer student programme when his country rejoined the organization. It was an amazing time to be at the laboratory: the W and Z bosons had just been discovered and the place was buzzing. “I couldn’t believe this place, it was the beginning of an absolute infatuation,” he says. That summer he met two people who were to influence his career: “My supervisor, Peter Sonderegger, with whom I learnt the ropes as an experimental physicist, and Luis Álvarez-Gaume, a rising star who took pity on the poor, hungry fellow-Spaniard hanging around at night in the CERN canteen.” After graduating from Valencia University, JJ’s PhD studies took him to the DELPHI experiment at CERN’s Large Electron–Positron collider. With the aid of a Fulbright scholarship, he then set off for America to work on the Mark II experiment at SLAC. From there it was back to CERN and DELPHI again, but in 1994 he left once more for the US, this time following his wife, Pilar Hernandez, to Harvard. An accomplished particle-physics theorist, she converted her husband to her speciality, neutrino physics, thus setting him on the trail that would lead him through the NOMAD, HARP and K2K experiments to the challenge of neutrinoless double-beta decay.

The neutrinoless challenge

A professor of physics for the past 15 years at the Institute of Nuclear and Particle Physics (IFIC), a joint venture between the University of Valencia and the Spanish research council (CSIC), he is currently leading NEXT – the Neutrino Experiment with a Xenon TPC. The aim is to search for neutrinoless double-beta decay using a high-pressure xenon time-projection chamber (TPC) in the Canfranc Underground Laboratory in the Spanish Pyrenees. JJ believes that the experiment has several advantages in the hunt for this decay mode, which would demonstrate that the neutrino must be its own antiparticle, as first proposed by Ettore Majorana (whose own life ended shrouded in mystery). The experiment uses xenon, which is relatively cheap and also cheap to enrich because it is a noble gas. Moreover, NEXT uses gaseous xenon, which gives 10 times better energy resolution for the decay electrons than the liquid form. By using a TPC, it also provides a topological signature for the double-beta decay.

The big challenge was to find a way to amplify the charge in the xenon gas without inducing sparks. The solution came when JJ talked to David Nygren, inventor of the TPC at Berkeley. “It was one of those eureka moments,” he recalls. “Nygren proposed using electroluminescence, where you detect light emitted by ionization in a strong field near the anode. You can get 1000 UV photons for each electron. He immediately realized that we could get the resolution that way.” JJ then came up with an innovative scheme to image the ionization electrons in the tracking plane using light-detecting pixels (silicon photomultipliers) – and the idea for NEXT was born. “It is hard for me not to believe in the goddess of physics,” says JJ. “Every time that I need help, she sends me an angel. It was Abe Seiden in California, Gary Feldman in Boston, Luigi di Lella and Ormundur Runolfson at CERN, Juan Antonio Rubio in Spain … and then Dave. Without him, I doubt NEXT would have ever materialized.” The collaboration now involves not only Spain and the US but also Colombia, Portugal and Russia. The generous help of a special Spanish funding programme, called CONSOLIDER-INGENIO, provided the necessary funds to get it going. “More angels came to help here,” he explains, “all of them theorists: José Manuel Labastida, at the time at the ministry of science, Álvaro de Rújula, my close friend Concepción González-García … really, the goddess gave us a good hand there.”

Despite the financial problems in Spain, JJ says that “there is a lot of good will” in MINECO, the Ministry of Economy, which currently handles science in Spain. He points out that there has already been a big investment in the experiment and that there is full support from the Canfranc Laboratory. He is particularly grateful for the “huge support and experience” of Alessandro Bettini, the former director of the Gran Sasso National Laboratory in Italy, who is now in charge at Canfranc. JJ finds Bettini and Nygren – both in their mid-seventies – inspirational characters, calling them the “Bob Dylans” of particle physics. Indeed, he set up an interview with both of them for the online cultural magazine Jotdown, where he regularly contributes a blog called “Faster than light”.

In many ways, JJ’s trajectory through particle physics is similar to that of any talented, energetic particle physicist pursuing his passion. So what about the novel? When did an interest in writing begin? JJ says that it goes back to when his family eventually settled in the town of Sagunto, near Valencia, when he was 15. An ancient city where modern steel-making stands alongside Roman ruins, he found it “a crucible of ideas”, where writers and artists mingled with the steel-workers, who wanted a more intellectual lifestyle for their children – especially after the return of democracy with the new constitution in 1978, following Franco’s death. JJ started writing poetry while studying physics in Sagunto, and when physics took him to SLAC in 1986, as a member of Stanford University, he was allowed to sit in on the creative-writing workshop. “I was not only the only non-native American but also the only physicist,” he recalls. “I’m not sure that they knew what to make of me.” Years later, he continued his formal education as a writer at the prestigious Escuela de Letras in Madrid.

A novel look at CERN

Around 2003, CERN was starting to become bigger news, with the construction of the LHC, experiments on antimatter and an appearance in Dan Brown’s mystery-thriller Angels & Demons. Having already written a book of short stories, La agonía de las libélulas (Agony of the dragonflies), published in 2000, JJ was approached by the Spanish publisher Espasa to write a novel that would involve CERN. Of course, the story would require action but it would also be a personal story, imbued with JJ’s love for the place. Materia Extraña, published in 2008, “deals with how someone from outside tries to come to grips with CERN,” he explains, “and also with the way that you do science.” It gives little away to say that at one and the same time it is CERN – but not CERN. For example, the director-general is a woman, with an amalgam of the characteristics that he observes to be necessary for women to succeed in physics. “The novel was presented in Madrid by Rubio,” says JJ. “At the time, we couldn’t have guessed that he had so little time left.” (Rubio was to pass away in 2010.)

When asked by Espasa to write another book, JJ turned from fiction to fact and the issue of energy. Here he encountered “a kind of Taliban of environmentalism” and became determined to argue a more rational case. The result was El ecologista nuclear (The Nuclear Environmentalist, now published in English) in which he sets down the issues surrounding the various sources of energy. Comparing renewables, fossil fuels and nuclear power, he puts forward the case for an approach based on diversity and a mixture of sources. “The book created a lot of interest in intellectual circles in Spain,” he says. “For example, Carlos Martínez, who was president of CSIC and then secretary of state (second to the minister) liked it quite a bit. Cayetano López, now director of CIEMAT, and an authority in the field, was kind enough to present it in Madrid. It has made some impact in trying to put nuclear energy into perspective.”

So how does JJ manage to do all of this while also developing and promoting the NEXT experiment? “The trick is to find time,” he reveals. “We have no TV and I take no lunch, although I go for a swim.” He is also one of those lucky people who can manage with little sleep. “I write generally between 11 p.m. and 2 a.m.,” he explains, “but it is not like a mill. I’m very explosive and sometimes I go at it for 12 hours, non-stop.”

He is now considering writing about nuclear energy, along the lines of the widely acclaimed Sustainable Energy – without the hot air by Cambridge University physicist David MacKay, who is currently the chief scientific adviser at the UK’s Department of Energy and Climate Change. “The idea would be to give the facts without the polemic,” says JJ, “to really step back.” He has also been asked to write another novel, this time aimed at young adults, a group where publisher Espasa is finding new readers. While his son is only eight years old, his daughter is 12 and approaching this age group. This means that he is in touch with young-adult literature, although he finds that at present “there are too many vampires” and admits that he will be “trying to do better”. That he is a great admirer of the writing of Philip Pullman, the author of the bestselling trilogy for young people, His Dark Materials, can only bode well.

• For more about the NEXT experiment see the recent CERN Colloquium by JJ Gómez Cadenas at http://indico.cern.ch/conferenceDisplay.py?confId=225995. For a review of El ecologista nuclear see the Bookshelf section of this issue.

The incomprehensibility principle

Educators and psychologists invented the term “attention span” to describe the length of time anyone can concentrate on a particular task before becoming distracted. It is a useful term but span, or duration, is only one aspect of attention. Attention must also have an intensity – and the two variables are independent of each other. Perhaps one can postulate an analogue of the Heisenberg uncertainty principle, in which the intensity of attention multiplied by its span cannot exceed some fixed value. I call this the “incomprehensibility principle” and I have had plenty of opportunities to observe its consequences.
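In symbols (my notation, offered purely as an illustration of the analogy), writing $I$ for the intensity of attention, $\Delta t$ for its span and $K$ for the fixed bound, the postulate reads:

```latex
% The "incomprehensibility principle": an upper bound on the product,
% in contrast to Heisenberg's lower bound \Delta x \,\Delta p \ge \hbar/2.
I \,\Delta t \;\le\; K, \qquad K = \text{const.}
```

The inequality runs the opposite way to Heisenberg’s: intense attention cannot be sustained for long, and long attention cannot be intense.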


In the hands of skilled presenters, information can be carefully packaged as entertainment so that the attention needed to digest it is minimal. The trick is to mask the effort with compelling emotional appeal and a floppy boy-band haircut. However, the need to pay attention is still there; in fact, absorbing even the most trivial information demands a modicum of attention. How many of us, when leaving a cinema, have had the nagging feeling that although the film made great entertainment some details of the plot remained less than crystal clear?

The existence of a minimum level of attention suggests that it is, in some sense, a quantum substance. This means that under close examination, any apparently continuous or sustained effort at paying attention will be revealed as a series of discrete micro-efforts. However, while attention can be chopped up and interleaved with other activities, even tiny pulses of attention demand full concentration, to the exclusion of all other voluntary activities. Any attempt at multitasking, such as using a mobile phone while driving a car, is counterproductive.

The incomprehensibility principle plays a major role in education, where it is closely linked to the learning process. Because of the subject matter and/or the teacher, some school lessons require more time to assimilate than others. This trend accelerates in higher education. In my case, a hint of what was to come appeared during my third year of undergraduate physics, when I attended additional lectures on quantum mechanics in the mathematics department at Imperial College London.

My teacher was Abdus Salam, who went on to share the Nobel Prize for Physics in 1979. Salam’s lectures were exquisitely incomprehensible; as I look back, I realize he was probably echoing his own experiences at Cambridge some 15 years earlier at the hands of Paul Dirac. But he quickly referred us to Dirac’s book, The Principles of Quantum Mechanics. At a first and even a second glance, this book shone no light at all but after intense study, a rewarding glimmer of illumination appeared out of the darkness.

Motivated by Salam’s unintelligibility, I began postgraduate studies in physics only to find that my previous exposure to incomprehensibility had been merely an introduction. By then, there were no longer any textbooks to fall back on and journal papers were impressively baffling. With time, though, I realized that – like Dirac’s book – they could be painfully decrypted at “leisure”, line by line, with help from enlightened colleagues.

The real problem with the incomprehensibility principle came when I had to absorb information in real time, during seminars and talks. The most impenetrable of these talks always came from American speakers because they were, at the time, wielding the heavy cutting tools at the face of physics research. Consequently, I developed an association between incomprehensibility and accent. This reached a climax when I visited the US, where I always had the feeling that dubious characters hanging out at bus stations and rest stops must somehow be experts in S-matrix theory and the like, travelling from one seminar to the next. Several years later, when I was at CERN, seminars were instead delivered in thick European accents and concepts such as “muon punch-through” became more of an obstacle when pointed out in a heavy German accent.

Nevertheless, I persevered and slowly developed new skills. The incomprehensibility principle cannot be bypassed but even taking into account added difficulties such as the speaker’s accent or speed of delivery – not to mention bad acoustics or poor visual “aids” – it is still possible to optimize one’s absorption of information.

One way of doing this is to monitor difficult presentations in “background mode”, paying just enough attention to follow the gist of the argument until a key point is about to be reached. At that moment, a concerted effort can be made to grab a vital piece of information as it whistles past, before it disappears into the obscurity of the intellectual stratosphere. The trick is to do this at just the right time, so that each concentrated effort is not fruitless. “Only cross your bridges when you come to them”, as the old adage goes.

By adopting this technique, I was able to cover frontier meetings on subjects of which I was supremely ignorant, including microprocessors, cosmology and medical imaging, among others. Journalists who find themselves baffled at scientific press conferences would do well to follow my example, for the truth is that there will always be a fresh supply of incomprehensibility in physics. Don’t be disappointed!

Gordon Fraser. Gordon, who was editor of CERN Courier for many years, wrote this as a ‘Lateral Thought’ for Physics World magazine but died before the article could be revised (see obituary). It was completed by staff at Physics World and is published in both magazines this month as a tribute.

CERN becomes UN observer


On 14 December, the UN General Assembly adopted a resolution to allow CERN to participate in the work of the General Assembly and to attend its sessions as an observer. With this new status, the laboratory can promote the essential role of basic science in development.

In a meeting with the UN secretary-general, Ban Ki-moon, on 17 December, CERN’s director-general, Rolf Heuer, pledged that CERN was willing to contribute actively to the UN’s efforts to promote science, in particular UNESCO’s initiative “Science for sustainable development”.

Ban Ki-moon, left, with Rolf Heuer.

Europe launches consortium for astroparticle physics

At the end of November, European funding agencies for astroparticle physics launched a new sustainable entity, the Astroparticle Physics European Consortium (APPEC). This will build on the successful work of the European-funded network, the AStroParticle European Research Area (ASPERA).

Over the past six years, ASPERA has brought together funding agencies and the physics community to set up European co-ordination for astroparticle physics. It has developed common R&D calls and created closer relationships with industry and other research fields. Above all, ASPERA has developed a European strategy for astroparticle physics to prioritize the large infrastructures needed to solve mysteries of the universe concerning, for example, neutrinos, gravitational waves, dark matter and dark energy.

APPEC now plans to develop a European common action plan to fund the upcoming large astroparticle-physics infrastructures as defined in ASPERA’s road map. Ten countries have already joined the new APPEC consortium, with nine others following the accession process. APPEC’s activities will be organized through three functional centres, located at DESY, the Astronomy, Particle Physics and Cosmology laboratory of the French CNRS/CEA, and the INFN’s Gran Sasso National Laboratory. Stavros Katsanevas of CNRS has been elected as chair of APPEC and Thomas Berghoefer of DESY as general secretary.

• APPEC is the Astroparticle Physics European Consortium. It currently comprises 10 countries represented by their Ministries, funding agencies or their designated institution: Belgium (FWO), Croatia (HRZZ), France (CEA, CNRS), Germany (DESY), Ireland (RIA), Italy (INFN), The Netherlands (FOM), Poland (NCN), Romania (IFIN) and the UK (STFC).

Quantum Gravity (Third Edition)

By Claus Kiefer
Oxford University Press
Hardback: £65 $117

The search for a quantum theory of the gravitational field is one of the great open problems in theoretical physics. This book covers the two main approaches to its construction – the direct quantization of Einstein’s general theory of relativity and string theory. There is a detailed presentation of the main approaches used in quantum general relativity: path-integral quantization, the background-field method and canonical quantum gravity in the metric, connection and loop formulations.