
Inside Story: Recounting fond memories of when DESY first began

Do you remember? If you are old enough you certainly will. I refer to the sixth decade of the last century, when the research centres CERN and DESY were created. About that time I tried to explain to my sister Jutta (an artist who always considered logarithms as some species of worms) our understanding of the structure of matter. I started with the usual story about all visible matter being made of molecules, which in turn are composed of atoms. And all atoms are made of very small particles called protons, neutrons and electrons. I even tried to explain some details of the nuclear, electromagnetic and gravitational forces; three basic particles and three forces, an elegant and simple scheme. I left out solar energy and radioactivity.

But Jutta was not happy. In the early 1950s she came with us to the Andes mountains to expose nuclear emulsions in which we searched for cosmic mesons and hyperons. Jutta was also an attentive observer during the many evenings that I spent with Gianni Puppi in the ancient building of the Physics Institute of Bologna, scanning bubble-chamber pictures provided from the US by Jack Steinberger. We were looking for the so-called Λ and θ particles, trying to learn about their spin and some difficult-to-understand parity violation. So Jutta knew that there were many more particles and effects in existence, which I could not explain to her.

And, at a certain point, we particle physicists did not like the situation either. Our initial excitement with the discovery of exotic particles did not last long. We were not pleased with the several hundred particles and excited states (most of them unstable) that had been found but which did not fit into our traditional scheme of the structure of stable matter. There was no good reason for them to exist. It seemed at a certain moment quite useless to continue adding more and more particles to this “particle zoo” as it was condescendingly called. We were just making a kind of “particle spectroscopy” with no visible goal in mind.

In addition, at that time we had already been forced to abandon our beloved organization in small university groups, each one proud of their individual discoveries. Now, it was often the case that several of these groups had to join forces to reach significant results. One extreme example was a collaboration of about a hundred physicists on a single project to expose an enormous emulsion stack in the higher atmosphere and subsequently to undertake its inspection. Results were published with more than a hundred authors on a single paper, a kind of horror vision for individualists. It was the beginning of the globalization of research, initiated (as are so many other issues) by particle physicists.

But none of this helped us understand the particle zoo. There was general agreement that new ways should be found, perhaps by the systematic study of reactions at higher energies. It was in this period that the European research centre CERN was created in 1954. Other local accelerator projects were started in a number of countries too, some of which were designed as a complement to the planned proton accelerator at CERN. A group of German physicists were dreaming about an electron machine, and this led to the foundation of DESY in Hamburg exactly 50 years ago.

However, life for electron-accelerator enthusiasts was not easy. While most particle physicists agreed about building proton machines, several did not accept the idea of working with electrons. I remember serious claims that everything related to electrons and electric charges could be accurately calculated within the framework of quantum electrodynamics. Consequently nothing new could be learnt from experimenting with electrons. Fortunately this was wrong!

The results of the following 50 years of global research are well known. Single papers are now often signed by more than a thousand authors and our understanding of the inner structure of matter has improved by a factor of a thousand. The existence of most of the particles of our zoo can be understood and their inner structure has been explained (including our protons and neutrons). Quarks and leptons as basic particles and several fundamental forces with their exchange quanta form an elegant scheme called the “Standard Model of particle physics”. There are still some problems to solve, but I did try again to explain the basics to my sister Jutta. She illustrated her feelings after our last discussion.

Nobel for optical fibres and CCDs

Charles Kao, who worked at Standard Telecommunication Laboratories, Harlow, UK, and was vice-chancellor of the Chinese University of Hong Kong, receives half of the 2009 Nobel Prize in Physics for “groundbreaking achievements concerning the transmission of light in fibres for optical communication”. Kao’s studies indicated in 1966 that low-loss fibres should be possible using high-purity glass, which he proposed could form waveguides with high information capacity.

Willard Boyle and George Smith, who worked at Bell Laboratories, Murray Hill, New Jersey, share the other half of the prize “for the invention of an imaging semiconductor circuit – the CCD sensor”. They sketched out the structure of the CCD in 1969, their aim being better electronic memory – but they went on to revolutionize photography.

ATLAS and CMS collect cosmic-event data…

The ATLAS collaboration has made the most of the long shutdown of the LHC by undertaking a variety of maintenance, consolidation and repair work on the detector, as well as major test runs with cosmic rays. The crucial repairs included work on the cooling system for the inner detector, where vibrations of the compressor caused structural problems. The extended shutdown also allowed some schedules to be brought forward. For instance, the very forward muon chambers have been partially installed, even though this was planned for the 2009/10 shutdown. The collaboration has also undertaken several upgrades to prepare for higher luminosity, such as the replacement of optical fibres on the muon systems in preparation for higher radiation levels.

In parallel, the analysis of cosmic data collected last year has allowed the collaboration to perform detailed alignment and calibration studies, achieving a level of precision far beyond expectations for this stage of the experiment. This work is set to continue, in particular from 12 October, when the ATLAS Control Room is to be staffed round the clock. The experiment will collect cosmic data continuously until first beam appears in the LHC. During this time, the teams will study the alignment, calibration, timing and performance of the detector.

CMS has also been making the most of testing with cosmic rays. During a five-week data-taking exercise starting on 22 July, the experiment recorded more than 300 million cosmic events with the magnetic field on. This large data-set is being used to improve further the alignment, calibration and performance of the various sub-detectors in the run-up to proton–proton collisions.

As with the other experiments, the shutdown period provided the opportunity for consolidation work on the detector. One of the most important items in CMS was the complete refurbishment of the cooling system for the tracker. The shutdown also gave the collaboration a chance to install the final sub-detector, the pre-shower, which consists of a lead–silicon “sandwich” with silicon-strip sensors only 2 mm wide. The pre-shower, which sits in front of the endcap calorimeters, can pinpoint the position of photons more accurately than the larger crystal detectors in the endcaps. This will allow a distinction to be made between two low-energy photons and one high-energy photon – crucial for trying to spot certain kinds of Higgs-boson decay.

When LEP, CERN’s first big collider, saw beam

On 13 November 1989, heads of state, heads of government and ministers from the member states assembled at CERN together with more than a thousand invited guests for the inauguration of the Large Electron–Positron (LEP) collider (see “PS and LEP: a walk down memory lane” below). Precisely one month earlier, on 13 October, large audiences had packed CERN’s auditorium and also taken advantage of every available closed-circuit TV to see the presentation of the first results from the four LEP experiments, ALEPH, DELPHI, L3 and OPAL – results that more or less closed the door on the possibility that a fourth type of neutrino could join those that were already known. This milestone came only two months after the first collisions on 13 August and three months after beam had circulated around LEP for the first time.

Champagne corks had already popped the previous summer, soon after 11.55 p.m. on 12 July 1988, when four bunches of positrons made the first successful journey between Point 1, close to CERN’s main site at Meyrin (Switzerland), and Point 2 in Sergy (France) – a distance of 2.5 km through much of the first of eight sectors of the 27-km LEP ring. It was a heady moment and the culmination of several weeks of final hardware commissioning. Elsewhere, the tunnel was still in various stages of completion, the last part of the difficult excavation under the Jura having been finished only five months earlier.

A year to do it all

Steve Myers led the first commissioning test and a week later he reported to the LEP Management Board, making the following conclusions: “It worked! We learnt a lot. It was an extremely useful (essential) exercise – exciting and fun to do. The octant behaved as predicted theoretically.” This led to the observation that “LEP will be more interesting for higher-energy physics than for accelerator physics!” However, he also warned, “We should not be smug or complacent because it worked so well! Crash testing took 4 months for about a tenth of LEP; at the same rate of testing the other nine tenths will require 36 months.” Yet the full start-up was already pencilled in for July 1989, in only 12 months’ time.

The following months saw a huge effort to install all of the equipment in the remaining 24 km of the tunnel – magnets, vacuum chambers, RF cavities, beam instrumentation, control systems, injection equipment, electrostatic separators, electrical cabling, water cooling, ventilation etc. This was followed by the individual testing of 800 power converters, which were then connected to their corresponding magnets while carefully ensuring the correct polarity. In parallel, the vacuum chambers were baked out at high temperature and leak-tested. The RF units, which were located at interaction regions 2 and 6, were commissioned and the cavities conditioned by powering them to the maximum of 16 MW. Much of this had to be co-ordinated carefully to avoid conflicts between testing and installation work in the final sector, sector 3-4. At the same time a great deal of effort – with limited manpower – went into preparing the software needed to operate the collider, in close collaboration with the accelerator physicists and the machine operators.

The goal for the first phase of LEP was to generate electron–positron collisions at a total energy of around 90 GeV, equivalent to the mass of the Z⁰, the neutral carrier of the weak force. It was to be a veritable Z⁰ factory, delivering Z⁰s galore to make precision tests of the Standard Model of particle physics – which it went on to do with outstanding success.

To “mass produce” the Z⁰s required beams not only of high energy, but also of high intensity. To deliver such beams required four major steps. The first was the accumulation of the highest possible beam current at the injection energy of 20 GeV, from the injection chain. (This was itself a major operation involving the purpose-built LEP Injection Linac (LIL) and Electron–Positron Accumulator (EPA), the Proton Synchrotron (PS), the Super Proton Synchrotron (SPS) and, finally, transfer lines to inject electrons and positrons in opposite directions – lines that curved not only horizontally but also vertically, as LEP and the SPS were at different heights.) The second step was to ramp up the accumulated current to the energy of the Z⁰, with minimal losses. Then, to improve the collision rate at the interaction regions, the beam had to be “squeezed” by reducing the amplitude of the betatron oscillations (beam oscillations about the nominal orbit) to a minimum value. Finally the cross-section of the beam had to be reduced at the collision points.
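How much each of these steps buys in collision rate can be read off from the textbook expression for collider luminosity (a standard relation, not one quoted in the commissioning reports): with k bunches per beam circulating at revolution frequency f, bunch populations N⁺ and N⁻, and transverse beam sizes σ*x and σ*y at the collision point,

L = \frac{k f N^{+} N^{-}}{4\pi\,\sigma_x^{*}\sigma_y^{*}}, \qquad \sigma^{*} = \sqrt{\varepsilon\,\beta^{*}},

where ε is the beam emittance and β* the amplitude function at the interaction point. Accumulation raises N⁺ and N⁻, while the squeeze lowers β* and hence the beam cross-section, increasing the luminosity for the same stored current.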

The first turn

In June 1989 the LEP commissioning team began testing the accelerator components piece by piece, while the rest of CERN’s accelerator complex continued as normal. Indeed, the small team found themselves running the largest accelerator ever built in what was basically a back room of the SPS Control Room at Prévessin.

The plan was to make two “cold check-outs” – without beam – on 7 and 14 July, with the target of 15 July for the first beam test. The cold check-out involved operating all of the accelerator components under the control of the available software, which proved important for debugging the complete system of hardware and software for energy ramping in particular. On 14 July, however, positrons were already available from the final link in the injection chain – the SPS – and so the second series of tests turned into a “hot check-out”. Over a period of 50 minutes, under the massed gaze of a packed control room, the commissioning team coaxed the first beam round a complete circuit of the machine – one day ahead of schedule.

In the days that followed, the team began to commission the RF, essential for eventual acceleration in LEP. The next month proved crucial but exciting as it saw the transition from a single turn round the machine to a collider with beams stored ready for physics.

By 18 July the first RF unit was in operation, with the RF timed in correctly to “capture” the beam for 100 turns round the machine. Two days later, the Beam Orbit Monitoring system was put into action, which allowed the team to measure and correct the beam’s trajectory. Measurements showed that the revolution frequency was correct to around 100 Hz in 352 MHz, or equivalently, that LEP’s 27 km circumference was good to around 8 mm. Work then continued on measuring and correcting the “tune” of the betatron oscillations, so that by 23 July a positron beam was able to circulate with a measured lifetime – derived from the observed decay of the beam current – of 25 minutes. Then, following a day of commissioning yet more RF units, the first electrons were successfully injected to travel the opposite way round the machine on 25 July.
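The quoted circumference precision follows from simple proportionality (a rough consistency check, assuming the fractional frequency error maps directly onto a fractional circumference error):

\frac{\Delta C}{C} \approx \frac{\Delta f}{f} = \frac{100\ \mathrm{Hz}}{352\ \mathrm{MHz}} \approx 2.8\times10^{-7}, \qquad \Delta C \approx 27\ \mathrm{km} \times 2.8\times10^{-7} \approx 8\ \mathrm{mm}.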

Now it was time to try to accumulate more injected beam in the LEP bunches and to see how this affected the vacuum pressure in the beam pipe. By 1 August the team was observing good accumulation rates and measured a record current of 500 μA for one beam. This was the first critical step towards turning LEP into a useful collider. The next would be to ramp up the energy of the beam.

The late evening of 3 August saw the first ramp from the injection energy of 20 GeV, step by step up to 42.5 GeV, when two RF units tripped. On the third attempt – at 3.30 a.m. on 4 August – the beam reached 47.5 GeV with a measured lifetime of 1 hour. Three days later, both electrons and positrons had separately reached 45.5 GeV. Then 10 August saw the next important step towards a good luminosity in the machine – an energy ramp to 47.5 GeV followed by a squeeze of the betatron oscillations.

In business

On 12 August LEP finally accumulated both electrons and positrons. The next day the beams were ramped and squeezed to 32 cm, yielding stable beams of 270 μA each. It was time to turn off the electrostatic separators that allowed the two beams to coast without colliding. The minutes passed and then, just after 11 p.m., Aldo Michelini, the spokesperson of the OPAL experiment, reported seeing the first collision. LEP was in business for physics.

So began a five-day pilot-physics run that lasted until 18 August. During this time various technical problems arose and the four experiments collected physics data for a total of only 15 hours. Nevertheless, the maximum luminosity achieved of 5 × 10²⁸ cm⁻²s⁻¹ was important for “debugging” the detector systems and allowed for the detection of around 20 Z⁰ particles at each interaction region.

A period of machine studies followed, allowing big improvements to be made in the collider’s performance and resulting in a maximum total beam current of 1.6 mA at 45.5 GeV with a squeeze to 20 cm. Then, on 20 September, the first physics run began, with LEP’s total energy tuned for five days to the mass peak for the Z⁰ and sufficient luminosity to generate a total of some 1400 Z⁰s in each experiment. A second period followed, this time with the energy scanned through the width of the Z⁰ at five different beam energies – at the peak and at ±1 GeV and ±2 GeV from the peak. This allowed the four experiments to measure the width of the Z⁰ and so announce the first physics results, on 13 October, only three months after the final testing of the accelerator’s components.

By the end of the year LEP had achieved a top luminosity of around 5 × 10³⁰ cm⁻²s⁻¹ – about a third of the design value – and the four experiments had bagged more than 30,000 Z⁰s each. The Z⁰ factory was ready to gear up for much more to come.

• Based on several reports by Steve Myers, including his paper at the second EPAC meeting, in Nice on 12–16 June 1990.

PS and LEP: a walk down memory lane

[A gallery of archive photographs accompanies this piece, tracing the history of the PS and LEP.]

Roy Glauber casts a light on particles

When Roy Glauber was a 12-year-old schoolboy he discovered the beauty of making optical instruments, from polarizers to telescopes. His mathematical skills stem from those early school days, when a teacher encouraged him to begin studying calculus on his own. When he progressed to Harvard in 1941 he was already a couple of years ahead and had absorbed a fair fraction of graduate-level studies by 1943, when he was recruited into the Manhattan Project at the age of 18. It was then that the erstwhile experimentalist began the transition to theoretician. Finding the experimental work rather less demanding than theory – “It seemed to depend on how to keep a good vacuum in a counter,” he recalls, “and I didn’t think I would do it any better” – he asked to join the Theory Division and was set to work on solving neutron-diffusion problems.

Following the war, Glauber gained his BSc and PhD from Harvard and after apprenticeships with Robert Oppenheimer in Princeton and Wolfgang Pauli in Zurich, he stood in for Richard Feynman for a year at Caltech and then settled back at Harvard in 1952. By this time, he says, “all of the interest was in nuclear physics studied through scattering experiments”. With increasing energies becoming available at particle accelerators, the wavelength associated with the incident particles was decreasing to nuclear dimensions and below. Viki Weisskopf and colleagues had already developed the cloudy crystal-ball model of the nucleus, which successfully described averaged neutron cross-sections, and Glauber believed that the idea could be extended. “I had this conviction that it ought to be possible to represent the nucleus as a semi-translucent ball, from 20 MeV up,” he recalls. However, what the optical models lacked, in Glauber’s view, “was a proper quantitative derivation based on the scattering parameters of individual nucleons”.

Inspired by work on electron diffraction by molecules that he had pursued at Caltech, Glauber began to think about how to apply optical Fraunhofer-diffraction theory to higher-energy nuclear collisions – in a sense, bringing about a fusion of two of his interests. At higher energies, he argued, individual collisions could be treated diffractively and allow nuclear calculations to be based on the familiar ground of optical-diffraction theory.

The result was a generalized nuclear diffraction theory, in which he introduced charges and internal co-ordinates that did not exist in the optical case, such as spin and isospin, and dealt with scattering from nuclei that contained many nucleons by treating arbitrary numbers of successive collisions. The key was to consider energy transfers that were small compared with the incident energy. This was a reasonable assumption at higher energies and it led to a useful approximation method that provided a mathematical development of the original optical model, and allowed treatment of the preponderance of inelastic transitions.
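Schematically – in modern notation rather than that of the original papers – the heart of the formalism is the composition rule for profile functions. If nucleon j sits at transverse position s_j and scatters with profile γ_j, the profile of the whole nucleus at impact parameter b, and the resulting scattering amplitude, are

\Gamma_A(\mathbf{b}) = 1 - \prod_{j=1}^{A}\bigl[1 - \gamma_j(\mathbf{b}-\mathbf{s}_j)\bigr], \qquad F(\mathbf{q}) = \frac{ik}{2\pi}\int d^{2}b\; e^{i\mathbf{q}\cdot\mathbf{b}}\,\bigl\langle \Gamma_A(\mathbf{b})\bigr\rangle,

with the average taken over nucleon configurations; expanding the product generates the single, double and higher multiple-scattering terms.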

The theory turned out to work quite well for proton–deuteron and proton–helium collisions in experiments at the Cosmotron at Brookhaven. “You could see single and double scattering in the deuteron and helium,” he explains, “and shadowing” – where target nucleons lie in the shadow of others. However, at the time there were no studies of heavier nuclei.

Glauber made the first of many visits to CERN in 1964 and arrived for a six-month sabbatical in February 1967. “It was a most dramatic time for me,” he recalls. The group led by Giuseppe Cocconi had begun measurements of proton scattering from nuclear targets using the first extracted-proton beam from the PS. They made a series of measurements at 19.3 GeV/c but, with the resolution of the spectrometer limited to 50 MeV, they could not separate elastic from inelastic scattering. Glauber realized that, extended to inelastic scattering, the theory would cover essentially all nuclear excitations in which there was no production of new particles. Taken together, the calculated elastic and inelastic cross-sections agreed exactly with what Cocconi’s group was measuring. Glauber presented the results of his work with Giorgio Matthiae of Cocconi’s group at a meeting in Rehovot in the spring of 1967. “We were doing quantitative high-energy physics for a change,” he says.

The work at CERN with Cocconi’s group left a big impression on Glauber: “It was something wonderful and inspiring.” He became “hooked on CERN”, returning many times for summers and sabbaticals, working on models for elastic scattering for experiments at the ISR and for UA4 on the SPS proton–antiproton collider. However, by the 1990s – the era of the Large Electron–Positron (LEP) collider – his visits became less frequent. “I found I had nothing new to say about LEP cross-sections,” he admits.

Today there is renewed interest in Glauber’s work, in particular among physicists involved with heavy-ion collisions. His early calculations of multiple diffraction laid the foundations for ideas that are central (in more ways than one) to studies in which nuclei collide at very high energies. The basic formalism of overlapping nucleons can be used to calculate the “centrality” of a collision – in other words, how head-on it is. However, other work in the field of optical theory also finds relevance in the unusual environment of heavy-ion collisions – in this case Glauber’s work on a quantum theory of optical coherence, which led to his share of the Nobel prize in 2005.
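To make the idea concrete, here is a minimal Monte Carlo sketch of the kind of “Glauber model” centrality estimate used in heavy-ion analyses today (the Woods-Saxon parameters and nucleon–nucleon cross-section below are illustrative values for lead nuclei, chosen for this sketch rather than taken from the article):

import numpy as np

def sample_nucleus(A=208, R=6.62, a=0.546, rng=None):
    """Sample the transverse positions of A nucleons from a Woods-Saxon
    density (defaults are illustrative values for Pb-208, in fm)."""
    rng = rng if rng is not None else np.random.default_rng()
    positions = []
    r_max = R + 10 * a
    while len(positions) < A:
        # uniform point in a sphere of radius r_max ...
        r = r_max * rng.random() ** (1 / 3)
        # ... accepted with the Woods-Saxon probability at that radius
        if rng.random() < 1.0 / (1.0 + np.exp((r - R) / a)):
            costh = rng.uniform(-1.0, 1.0)
            phi = rng.uniform(0.0, 2.0 * np.pi)
            sinth = np.sqrt(1.0 - costh ** 2)
            # keep only the transverse (x, y) coordinates
            positions.append([r * sinth * np.cos(phi), r * sinth * np.sin(phi)])
    return np.array(positions)

def n_participants(b, sigma_nn=6.4, A=208, rng=None):
    """Number of participant nucleons at impact parameter b (fm);
    sigma_nn is the inelastic NN cross-section in fm^2 (6.4 fm^2 = 64 mb)."""
    rng = rng if rng is not None else np.random.default_rng()
    nucl_a = sample_nucleus(A, rng=rng) + np.array([b / 2.0, 0.0])
    nucl_b = sample_nucleus(A, rng=rng) - np.array([b / 2.0, 0.0])
    d2_max = sigma_nn / np.pi  # "black-disk" interaction distance, squared
    # squared transverse distances between every nucleon pair (A x A matrix)
    d2 = ((nucl_a[:, None, :] - nucl_b[None, :, :]) ** 2).sum(axis=-1)
    # a nucleon participates if it interacts with anyone in the other nucleus
    return int((d2 < d2_max).any(axis=1).sum() + (d2 < d2_max).any(axis=0).sum())

# central collisions (small b) yield many participants, peripheral ones few
for b in (0.0, 5.0, 10.0, 14.0):
    print(f"b = {b:4.1f} fm  ->  N_part ~ {n_participants(b)}")

Production implementations add event-by-event fluctuations, nucleon exclusion distances and a mapping from the participant count to measured multiplicity, but the counting logic – overlapping “black disks” of nucleons – is the same.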

This work again dates back to the late 1950s and the discovery by Robert Hanbury Brown and Richard Twiss of correlations in the intensities measured by two separated photon detectors observing the same light source. Their ultimate aim had been to extend their pioneering work on intensity interferometry at radio wavelengths to the optical region, so as to measure the angular sizes of stars – which they went on to do for Sirius and others. However, they first set up an experiment in the laboratory to reassure themselves that the technique would work at optical wavelengths. The result was surprising: light quanta have a significant tendency to arrive in pairs, with a coincidence rate that approaches twice that of the random background level. Extending the idea led to predictions that a laser source, with its narrow bandwidth, should show a large correlation effect. Glauber was sceptical, so he embarked on a proper quantum-theoretical treatment of the statistics of photon detection.

“Correlated pairs are characteristic of unco-ordinated chaotic emission from lots of sources,” he explains, “where the statistics are Gaussian. This is not a characteristic of light from a laser where all of the atoms know quite well what the other atoms are doing.” He realized correctly that this co-ordination means that there should be no Hanbury-Brown–Twiss correlation for a laser source and he went on to lay down the theoretical ground work for the field of quantum optics – the work that led to the Nobel prize.
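In the notation Glauber introduced (standard quantum-optics language, not a formula quoted in the interview), the distinction is captured by the normalized second-order correlation function at zero delay,

g^{(2)}(0) = \frac{\langle \hat{a}^{\dagger}\hat{a}^{\dagger}\hat{a}\hat{a} \rangle}{\langle \hat{a}^{\dagger}\hat{a} \rangle^{2}},

which equals 2 for chaotic (thermal) light – the factor-of-two Hanbury-Brown–Twiss enhancement – and exactly 1 for the coherent states that describe an ideal laser, where no correlation appears.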

There are similarities between the statistics in the detection of photons (bosons) and those of the detection of pions (also bosons) in heavy-ion collisions. The energetic collision should be like a thermal light source, with correlated pion emission akin to the Hanbury-Brown–Twiss correlations allowing the possibility of measuring the size of the source, as in the astronomical studies. Experiments do find such an effect but they do not see the full factor of two above the random background and the reason is yet to be properly understood. While the width of the measured peak may relate to the radius of the source, “we don’t have a theory of the radiation process that explains fully the correlation”, says Glauber, “no real quantitative explanation. Perhaps other things are upsetting the correlations.”

The LHC will explore further the realm of heavy-ion collisions and push on with measurements of the proton–proton total cross-section, a focus of the TOTEM experiment. While these links remain between his work and CERN, Glauber observes that the laboratory has changed a great deal since his first visits, but he is still “very devoted to the place as an ideal”. What, then, does he hope in general for the LHC? “Pray to find a surprise,” he says. “It may be difficult to design an experiment to detect what you least expect, but we really need some surprises.”

• For Roy Glauber’s colloquium at CERN on 6 August, see http://indico.cern.ch/conferenceDisplay.py?confId=62811.

LEP – The Lord of the Collider Rings at CERN, 1980–2000: The Making, Operation and Legacy of the World’s Largest Scientific Instrument

By Herwig Schopper, Springer. Hardback ISBN 9783540893004 €39.95 (£36.99, $59.95). Online version ISBN 9783540893011.

Herwig Schopper’s energy and vitality remain undimmed, even though he turned 85 this year (CERN honours Schopper at 85). His book surveys the two decades of the Large Electron–Positron (LEP) collider, extending far beyond his own reign as CERN director-general in the years 1981–88.

From the outset, Schopper criticizes historians who have spurned his offer of first-hand but anecdotal input, preferring conventional archives and minutes. He contends that such lack of imagination can obscure the full picture. Thus the book is at its best when he relates how CERN’s history was moulded rather than recorded. Nobody was taking minutes when Schopper had working breakfasts with influential council delegates. Another example is his nomination as CERN’s director-general, where Italy was initially pushing for its own candidate. The sequel came later, when he carefully stage-managed an extension to his mandate to oversee the construction of LEP through to completion.

Fierce debate centred on the parameters of LEP: its circumference, tunnel diameter, precise footprint and the energy of its beams. Overseeing LEP called for a high level of scientific statesmanship. It was the largest civil-engineering project in Europe prior to the Channel Tunnel. As well as the technical challenge of building such a large underground ring at CERN, close to the Jura mountains, there was the diplomatic and demographic challenge of doing so beneath an international border, running close to and under suburbs and villages.

Closer to home was the thorny problem of catering for the physicists clamouring to use the new machine. How many detectors would be needed? Who would build and operate them? Who would lead the teams? With so much at stake, and so much enthusiasm, there was a lot of pushing and shoving to scramble aboard.

Schopper inherited the proton–antiproton collider in CERN’s Super Proton Synchrotron ring and while LEP was being planned and built he presided over the laboratory during the historic discovery of the W and Z particles – the carriers of the electroweak force. He recalls how this fast-moving research called for some skilful moves. In the middle of all this, the UK’s prime minister Margaret Thatcher dropped in, accompanied by her husband – “an elder (sic) gentleman whom she treated with astonishing kindness,” writes Schopper.

Experience had shown that LEP had to be presented from the outside as an integral part of CERN’s basic programme. However, this meant that no new money would be available. CERN’s research activities had to be pruned, a decision that did not go down well everywhere. Equally controversial were some deft moves on CERN’s balance sheets, transferring money between columns earmarked for operations and investments.

While planning and construction of the machine was hectic, it was usually predictable, but in the middle of it all, CERN was caught unawares when the UK, one of its major contributors, suddenly threatened to pull out completely. To counter the threat, CERN had to undergo painful invasive examination by an external committee. Its final recommendations were difficult to swallow but left CERN leaner and sharper. Schopper’s inside account of this period is most revealing.

Probably the biggest LEP controversy came right at the end. With its beam energy boosted to the limit in 2000, LEP was beginning to show tantalizing hints of the long-awaited Higgs particle. But the CERN juggernaut is irresistible. Before it had completed its act, LEP was kicked off the stage by the LHC proton collider for which the tunnel had been presciently designed right from the start. Schopper describes the resulting criticism and points out that it would indeed be ironic if the LHC found the Higgs inside the energy range that was still being explored by LEP.

Making decisions is not easy: long-term advantages can demand short-term sacrifices. Political popularity is another luxury, but highly visible VIP visits do seem to boost an organization’s self-esteem. Most titillating is when Schopper puts LEP aside and reveals what went on behind the scenes to get the Pope, the Dalai Lama and other VIPs to visit CERN. The initial machinations and detailed planning for the visits of French presidents and prime ministers had to be abandoned when their last-minute changes called for frantic improvisation.

The cumbersomely titled The Lord of the Collider Rings is a valuable addition to particle-physics literature but it is mainly written for insiders. The names of people, machines and physics measurements tumble onto the page with little introduction. Schopper acknowledges that some of the illustrations are not optimal. This makes the book look as though it were hastily assembled and gives the CERN reader a sense of déjà vu, which is underlined by a statutory presentation of the Standard Model.

There are a few minor errors. Schopper naturally prefers the Germanic Wilhelm von Ockham to William of Occam, of eponymous razor fame, who was English (but died in Bavaria). Physics World is published by the UK Institute of Physics, not the “British Physical Society”. Furthermore, there is little mention of the Stanford Linear Collider, which briefly trod on LEP’s toes in 1989.

Schopper’s anecdotes and insider views are certainly better entertainment – and possibly more incisive – than a dry formal history. After his LEP revelations, one now looks forward to what his successors at CERN will say about the groundwork for the LHC (historians, please take note).

The Large Hadron Collider: a Marvel of Technology

by Lyndon Evans (ed), EPFL Press. Paperback ISBN 9782940222346, €45 (SFr69).

Edited by Lyn Evans, the LHC project leader, this book outlines in a well-balanced manner the history, physics and technologies behind the most gigantic scientific experiment at CERN: the LHC accelerator and its detectors. The book describes the highlights of the LHC’s construction and the technologies developed and used for both the accelerator and the experiments. The 16 chapters are all written by leaders of activities within the LHC project. The timing is perfect: the book arrives on the shelf just in time for the anticipated start of LHC-physics data-taking.

There are thousands of people at CERN – from universities and collaborating institutions around the globe – who have accompanied the LHC project over the past two decades or joined during the construction phase. In this book they will find a superb record and detailed account of their own activities and the many aspects and challenges that their colleagues involved in the LHC construction had to face and solve. It features excellent photos that illustrate many of the ingenious technological inventions and show the detailed LHC infrastructure, components and experimental equipment installed both in the tunnel and above ground.

Interested readers will learn about the scientific questions and theory behind the LHC. The book presents in detail the scale, complexity and challenges inherent in the realization of this wonder of technology. Readers will gain an insight into the managerial and organizational aspects of long-term planning in present-day, large-scale science projects. They will learn much about superconductivity and superconducting magnets; industrial-scale cryogenic plants and cryogenics; ultra-high vacuum techniques; beam physics, injection, acceleration and dumping; as well as environmental protection and security aspects around the LHC. They will also read about the complex political processes behind the approval, funding, purchasing and construction of these enormous scientific experiments.

Colleagues involved in new, large-scale scientific projects in Europe – e.g. ITER, XFEL, FAIR, ESS – would be well advised to read this book for the benefit of their respective projects. Many unforeseen problems faced during project execution, which required unconventional flexible measures to be adopted, are openly presented and discussed, with mention of the lessons to be learnt.

A significant part of the book is devoted to the description of the four major LHC experiments by their respective spokespersons and to the LHC data analysis and the Grid. The introduction is written by T S Virdee and provides a good overview of particle-detection basics, detector developments and challenges at the LHC. This section of the book is dedicated not only to the thousands of scientists, engineers and technicians involved in preparing LHC detectors worldwide but also – an interesting idea – to the agencies that funded the LHC detectors to a large extent.

In summary, this book comes at the right time and should be on the shelf of all friends of the LHC because it represents a nicely balanced record of the historical developments, technical challenges and scientific background. It is packed with many, many photos of the LHC taken during construction and assembly.

The Quantum Frontier: The Large Hadron Collider

by Don Lincoln, foreword by Leon Lederman, Johns Hopkins University Press. Hardback ISBN 9780801891441, $25.

As I write this review, in less than one week’s time I will be starting the second year of my physics-undergraduate degree at McGill University, Montreal. During the past summer I was granted the chance to spend time at CERN, an aspiration for every young physicist. Working as a student journalist for the CERN Bulletin, I was able to get away with asking enough questions to drive everyone mad and learnt a great deal about the various experiments, past and present, conducted at CERN, in particular at the LHC. However, even after two months of constant probing, the LHC still held many more secrets and fantastic intricacies that I sought to understand.

It was only during my final weeks that the answers to these questions were found, by reading The Quantum Frontier. Don Lincoln’s playful, energetic style took me from the fundamentals of contemporary physics through to the extremely complex and sophisticated guts of the LHC experiments, touching on everything from the Earth’s “inevitable” destruction by black holes to speculative future physics experiments in a post-LHC era.

Cracking it open for the first time, I was worried that a book taking under 200 pages to cover such an ambitious topic would be riddled with sterile facts listed one after the other. But the contrary is what I found. Lincoln starts by addressing the misconception foremost in the watching world’s mind: will the LHC destroy the planet and all of us with it? Tackling this issue first with an overview of basic material often covered in high-school science classes (the components of the atom, etc.), Lincoln goes on to peer deeper and deeper into the world of particle physics, laying out the basic building blocks of matter and what the LHC hopes to discover.

As a student of the subject, I found that some of the material was familiar, while a great deal of the new ideas and theories were elegantly explained. Lincoln kept me happily engaged with poignant and often funny analogies that facilitated the explanations and made for a concise understanding. Like any scientifically relevant book, it uses diagrams and graphs to elaborate ideas, but their inclusion is not daunting.

Being a particle physicist himself, Lincoln gives us a chance to see the world from such a perspective and conveys the excitement and awe that is experienced working in this field.

DOE allocates Fermilab an additional $60.2 million

In the latest instalment of funding from the US Department of Energy’s (DOE) Office of Science under the American Recovery and Reinvestment Act, Fermilab is to receive an additional $60.2 million to support research towards next-generation particle accelerators and preliminary designs for a future neutrino experiment.

The new funds are part of more than $327 million announced by Energy Secretary Steven Chu on 4 August from funding allocated under the Recovery Act to DOE’s Office of Science. Of these funds, $220 million will go towards projects at DOE national laboratories. While many of the physics-related projects are associated with fusion research or light sources, Fermilab and the Brookhaven National Laboratory have both received support for activities in high-energy physics.

Taking the stimulus funds announced earlier this year into account, the Recovery Act is allocating more than $100 million to Fermilab. Out of the additional $60.2 million announced in August, the laboratory will devote $52.7 million to research on next-generation accelerators using superconducting RF technology. The remaining $7.5 million will go to fund a preliminary design for a future neutrino experiment, in collaboration with Brookhaven, which has received $6.5 million for neutrino research in addition to $3 million for improvements to its light source.
