Deep-inelastic scattering enters the LHC era

The unusually early date for the 20th International Workshop on Deep-Inelastic Scattering and Related Subjects proved not to be a problem. The trees were all in blossom in Bonn during DIS 2012, which was held there on 26–30 March, and the sun shone for most of the week. As is the tradition for these workshops, the first day consisted of plenary talks, with the ensuing three days devoted to parallel sessions, followed by a final day of summary talks from the seven working groups. Almost all of the 300 participants also gave talks: there were as many as 275 contributions, not including the summaries. For the first time, the number of results from the LHC experiments at CERN was larger than from DESY’s HERA collider, which shut down in 2007. Given such a large number of contributions, it is not possible to do justice to them all, so the following report presents only a few rather subjective highlights.

With the move from dominantly electron–proton collisions to more and more results coming from hadron colliders, the workshop started with an “Introduction to deep-inelastic scattering: past and present” by Joël Feltesse of IRFU/CEA/Saclay. Talks on theory and on experiment followed, which covered the full breadth of the topics presented in more detail in the parallel sessions. With running at Fermilab’s Tevatron coming to an end in 2011, results with the complete data set are now being released by the CDF and DØ collaborations. There were also several results from the LHC experiments based on the complete data set for 2011. The emphasis in many of the theory presentations was on calculating processes to higher orders and on parton density function (PDF) and scale uncertainties.

Structure functions and PDFs

Measurements relevant to the determination of the PDFs of the nucleon were reported, based on combined data from the HERA experiments, H1 and ZEUS, from the LHC experiments, ATLAS, CMS and LHCb, as well as from the Tevatron experiments, CDF and DØ. New experimental results have come – in particular from the LHC – on Drell-Yan production, including W and Z bosons, and from HERA and the LHC on jet production, including jets with heavy flavour. In addition, analyses of deep-inelastic scattering (DIS) data on nuclei were presented at the workshop.

There has been substantial progress in the development of tools for PDF fitting, including the so-called HERAfitter package. This package is designed to include all types of data in a global fit and can be used by both experimentalists and theorists to compare different theoretical approaches within a single framework. The FastNLO package can calculate next-to-leading-order (NLO) jet cross-sections on grids that can then be used for comparisons of data and theory, as well as in PDF fitting. Figure 1 shows a comparison of data and theory for many different energies and processes.
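
The grid idea behind tools such as FastNLO can be sketched as follows (a minimal, self-contained toy in Python, not the actual FastNLO interface – the grid nodes, weights and toy PDF below are all invented for illustration): the expensive perturbative coefficients are computed once on a grid in x and Q2, after which the prediction for any new PDF set reduces to a cheap weighted sum over the grid, which is what makes repeated evaluation inside a PDF fit affordable.

# Illustrative sketch of a fast-grid cross-section evaluation (not the FastNLO API).
# The weights stand in for perturbative coefficients computed once at NLO; a new
# prediction for a different PDF is then just a double sum over the grid.
import numpy as np

x_nodes = np.logspace(-4, 0, 50)     # hypothetical grid in momentum fraction x
q2_nodes = np.logspace(1, 4, 20)     # hypothetical grid in Q^2 [GeV^2]
weights = np.random.rand(50, 20)     # placeholder for precomputed NLO coefficients

def toy_pdf(x, q2):
    """Placeholder parton density; a real fit would evolve this with DGLAP."""
    return x**-0.2 * (1 - x)**3 / np.log(q2)

def predicted_cross_section(pdf):
    """Weighted sum over the grid: sigma = sum_ij w_ij * f(x_i, Q2_j)."""
    f_vals = pdf(x_nodes[:, None], q2_nodes[None, :])
    return float(np.sum(weights * f_vals))

print(predicted_cross_section(toy_pdf))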

Looking at the current status of fit results, the conclusion is that the determination of the PDFs still gives rise to some controversy but that there is progress in understanding the differences, as Amanda Cooper-Sarkar of Oxford University explained. All of the groups presented PDFs up to next-to-next-to-leading order (NNLO) in the Dokshitzer-Gribov-Lipatov-Altarelli-Parisi (DGLAP) formalism. Extensions of the formalism into the Balitsky-Fadin-Kuraev-Lipatov (BFKL) regime and into the high-density regime of nuclei are in progress. The H1 and ZEUS collaborations have also measured the longitudinal structure function, FL. However, the precision is still not good enough to discriminate between different models and their predictions for the gluon density.
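
For orientation, the DGLAP evolution equations referred to here can be written in the standard schematic form

\[ \frac{\partial f_i(x,Q^2)}{\partial \ln Q^2} \;=\; \frac{\alpha_s(Q^2)}{2\pi} \sum_j \int_x^1 \frac{dz}{z}\, P_{ij}(z)\, f_j\!\left(\frac{x}{z},\,Q^2\right), \]

where the f_i are the parton densities and the P_ij the splitting functions, now known to NNLO; the BFKL approach instead resums the logarithms of 1/x that become important at very small x.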

Measurements of cross-sections of diffractive processes in DIS open the opportunity to probe the partonic structure of the colourless exchange, the goal being to determine the diffractive PDFs of the nucleon. The H1 and ZEUS collaborations selected diffractive processes in DIS either by requiring the detection of a scattered proton in dedicated proton spectrometers (ZEUS-LPS or H1-FPS) at small angles to the direction of the proton beam, or, without proton detection, by requiring a large rapidity gap between the produced jet or vector meson and the proton beam direction. Figure 2 shows reduced cross-sections obtained from LPS and FPS data (and also combined), which were presented at the workshop. The LHC experiments have also started to contribute to diffraction studies. The ATLAS collaboration reported on an analysis of diffractive events selected by a rapidity gap.

Searches and tests

At the time of the conference, the LHC experiments had only tantalizing hints of an excess in the mass region around 125 GeV using the data from 2011. It was nevertheless impressive that many results could be shown using the full 5 fb–1 of data that had been collected that year. The Higgs searches were the only ones to show any real sign of new particles. All others saw no significant indications and could only set upper limits. Experiments at both the LHC and the Tevatron have now measured WW, ZZ and WZ production with cross-sections that are consistent with Standard Model expectations, calculated to NLO and higher.

As Feltesse reminded participants in his talk, measurements of hadronic final states in DIS were the cradle for the development of the theory of strong interactions, QCD. Such measurements remain key for testing QCD predictions. New results were presented from HERA and the LHC, in which the QCD analyses have reached an impressive level of precision. While leading-order-plus-parton-shower Monte Carlos provide a good description of the data in general, a number of areas can be identified where the description is not good enough. Higher-order generators are needed here and it is important that appropriate tunes are used.

In general, NLO QCD predictions give a good description of the data. However, the theoretical uncertainty from missing higher-order corrections is almost everywhere much larger than the experimental errors. Moreover, it was shown that in several cases the fragmentation of partons into hadrons is not well described by NLO QCD calculations.

A central issue is the value and precision of the strong coupling constant, αS, and its running as a function of the energy scale. Many results were presented that improve the precision and show that the energy dependence is well described by QCD calculations.

There has been a great deal of progress in calculations of heavy-quark production. A particular highlight is the first complete NNLO QCD prediction for the pair-production of top quarks in the quark–antiquark annihilation channel. There is also a wealth of data from HERA, the LHC, the Tevatron and the Relativistic Heavy-Ion Collider (RHIC) on the production both of quarkonia and of open charm and beauty. The precision with which the Tevatron experiments can measure the masses of both the top quark and the W boson is particularly impressive. Although the LHC experiments have more events of both sorts by now, it will still take some time before the systematic uncertainties are understood well enough to achieve similar levels of precision.

The X,Y,Z states discovered in recent years have been studied by the experiments at B factories, the LHC and the Tevatron. Their theoretical interpretations are still a challenge. The LHCb experiment has performed the world’s best measurements of the properties of the Bc meson and b baryons and has made important contributions in other areas where its ability to measure particles in the forward direction is important.

Experiments that use polarized beams in DIS on polarized targets are relevant for studying the spin structure of nucleons. New results were presented from HERMES at HERA and COMPASS at CERN’s Super Proton Synchrotron, as well as from experiments at RHIC and Jefferson Lab. A tremendous amount of data has been collected and is now being analysed. Current results confirm that neither the quarks nor the gluons carry much of the nucleon spin. This leaves angular momentum. However, a picture describing the nucleon as a spatial object carrying angular momentum has yet to be settled.

The conceptual design report for a future electron–proton collider using the LHC together with a new electron accelerator, known as the LHeC, was released a couple of months after DIS 2012. This was the main topic of the last plenary talk at the workshop. In the parallel sessions, a broad spectrum of options for the future was discussed, covering the upgrades of the LHC machine and detectors, the upgrade plans at Jefferson Lab and RHIC, as well as proposed new accelerators such as an electron–ion collider, the EIC. One of the central aims is to understand better the 3D structure of the proton in terms of generalized parton distribution functions.

DIS 2012 participants once again profited from lively and intense discussions. The conveners of the working groups worked hard to put together informative and interesting parallel sessions. They also organized combined sessions for topics that were relevant for more than one working group. For relaxation, the workshop held the conference dinner in the mediaeval castle “Burg Satzvey”, which was a big success. Many of the participants also went on one of several excursions on offer. Next year, DIS moves south and will take place on 22–26 April in Marseilles.

SLAC at 50: honouring the past and creating the future

In the early 1960s, a 4-km-long strip of land in the rolling hills west of Stanford University was transformed into the longest, straightest structure in the world – a linear particle accelerator. It was first dubbed Project M and affectionately known as “the Monster” by the scientists at the time. Its purpose was to explore the mysterious subatomic realm.

Fifty years later, more than 1000 people gathered at SLAC National Accelerator Laboratory to celebrate the scientific successes generated by that accelerator and the ones that followed, and the scientists who developed and used them. The two-day event on 24–25 August, for employees, science luminaries and government and university leaders, was more than a tribute to the momentous discoveries and Nobel prizes made possible by the minds and machines at SLAC. It also provided a look ahead at the lab’s continuing evolution and growth into new frontiers of scientific research, which will keep it at the forefront of discovery for decades to come.

A history of discovery

The original linear-accelerator project, approved by Congress in 1961, was a supersized version of a succession of smaller accelerators, dubbed Mark I to Mark IV, which were built and operated at Stanford University and reached energies of up to 730 MeV. The “Monster” would accelerate electrons to much higher energies – ultimately to 50 GeV – for ground-breaking experiments in creating, identifying and studying subatomic particles. Stanford University leased the land to the federal government for the new Stanford Linear Accelerator Center (SLAC) and provided the brainpower for the project. This set the stage for a productive and unique scientific partnership that continues today, supported and overseen by the US Department of Energy.

Soon after the new accelerator reached full operation, a research team that included physicists from SLAC and Massachusetts Institute of Technology (MIT) used the electron beam in a series of experiments starting in 1967 that provided evidence for hard scattering centres within the proton – in effect, the first direct dynamical evidence for quarks. That research led to the awarding of the 1990 Nobel Prize in Physics to Richard Taylor and Jerome Friedman of SLAC and Henry Kendall of MIT.

SLAC soon struck gold again with discoveries that were made possible by another major technical feat – the Stanford Positron Electron Asymmetric Ring, SPEAR. Rather than aiming the electron beam at a fixed target, the SPEAR ring stored beams of electrons and positrons from the linear accelerator and brought them into steady head-on collisions.

In 1974, the Mark I detector at SPEAR, run by a collaboration from SLAC and Lawrence Berkeley National Laboratory, found clear signs of a new particle – but so had an experiment on the other side of the US. In what became known as the “November Revolution” in particle physics, Burton Richter at SLAC and Samuel Ting at Brookhaven National Laboratory announced their independent discoveries of the J/ψ particle, which consists of a paired charm quark and anticharm quark. They received the Nobel Prize in Physics for this work in 1976. Only a year after the J/ψ discovery, SLAC physicist Martin Perl announced the discovery of the τ lepton, a heavy relative of the electron and the first of a new family of fundamental building blocks. He went on to share the Nobel Prize in Physics in 1995 for this work.

These and other discoveries that reshaped understanding of matter were empowered by a series of colliders and detectors. The Positron–Electron Project (PEP), a collider ring with a diameter almost 10 times larger than SPEAR, ran during the years 1980–1990. The Stanford Linear Collider (SLC), completed in 1987, focused electron and positron beams from the original linac into micron-sized spots for collisions at a total energy of 100 GeV. Making thousands of Z bosons in its lifetime, the SLC hosted a decade of seminal experiments. It also pioneered the concepts behind the current studies for a linear electron–positron collider to reach energies in the region of 1 TeV.

PEP was followed by the PEP-II project, which included a set of two storage rings and operated in the years 1998–2008. PEP-II featured the BaBar experiment, which created huge numbers of B mesons and their antimatter counterparts. In 2001 and 2004, BaBar researchers and their Japanese colleagues at KEK’s Belle experiment announced evidence supporting the idea that matter and antimatter behave in slightly different ways, confirming theoretical predictions of charge-parity violation.

Synchrotron research and an X-ray laser

Notably, new research areas and projects at SLAC have often evolved as the offspring of the original linear accelerator and storage rings. Researchers at Stanford and SLAC quickly recognized that electromagnetic radiation generated by particles circling in SPEAR, while considered a nuisance to the particle collision experiments, could be extracted from the ring and used for other types of research. They developed this synchrotron radiation – in the form of beams of X-ray and ultraviolet light – as a powerful scientific tool for exploring samples at a molecular scale. This early research blossomed as the Stanford Synchrotron Radiation Project (SSRP), a set of five experimental stations that opened to visiting researchers in 1974.

Its modern descendant, the Stanford Synchrotron Radiation Lightsource (SSRL), now supports 30 experimental stations and about 2000 visiting researchers a year. SPEAR – or more precisely, SPEAR3 following a series of upgrades – became dedicated to SSRL operations 20 years ago. This machine, too, has allowed Nobel-prize winning research. Roger Kornberg, professor of structural biology at Stanford, received the Nobel Prize in Chemistry in 2006 for work detailing how the genetic code in DNA is read and converted into a message that directs protein synthesis. Key aspects of that research were carried out at the SSRL.

Cutting-edge facilities

Meanwhile, sections of the linear accelerator that defined the lab and its mission in its formative years are still driving electron beams today as the high-energy backbone of two cutting-edge facilities: the world’s most powerful X-ray free-electron laser, the Linac Coherent Light Source (LCLS), which began operating in 2009; and FACET, a test bed for next-generation accelerator technologies. LCLS-II, an expansion of the LCLS, should begin construction next year. It will draw electrons from the middle section of the original linear accelerator and use them to generate X-rays for probing matter with high resolution at the atomic scale.

The late Wolfgang “Pief” Panofsky, who served as the first director of SLAC from 1961 until 1984, often noted that big science is powered by a ready supply of good ideas. He referred to this as the “innovate or die” syndrome. In 1983, Panofsky wrote that he had been asked since the formation of the lab, “How long will SLAC live?” The answer was and still is: “about 10 to 15 years, unless somebody has a good idea. As it turns out, somebody always has had a good idea which was exploited and which has led to a new lease on life for the laboratory.”

Under the leadership of its past two directors – Jonathan Dorfan, who helped launch the BaBar experiment and the astrophysics programme, and Persis Drell, who presided over the opening of the LCLS – SLAC’s scientific mission has grown and diversified. In addition to its original focus on particle physics and accelerator science, SLAC researchers now delve into astrophysics, cosmology, materials and environmental sciences, biology, chemistry and alternative energy research. Visiting scientists still come by the thousands to use lab facilities for an even broader spectrum of research, from drug design and industrial applications to the archaeological analysis of fossils and cultural objects. Much of this diversity in world-class experiments is based on continuing modernizations at the SSRL and the unique capabilities of the LCLS.

SLAC’s scientists and engineers continue to collaborate actively in international projects – designing machines and building components, running experiments and sharing data with other accelerator laboratories in the US and countries around the globe, including China, France, Germany, Italy, Japan, Korea, Latin America, Russia, Spain and the UK. The lab’s long-standing collaboration with CERN provided an important spark in the formative years of the World Wide Web and led to SLAC’s launch of the first web server in the US. SLAC is also playing an important role in the ATLAS experiment at CERN’s LHC. In the area of synchrotron science, collaborations with US national laboratories and with overseas labs such as DESY in Germany and KEK in Japan have contributed greatly to the development of advanced tools and methodologies, with enormous scientific impact.

Expertise in particle detectors has even elevated the lab’s research into outer space. SLAC managed the development of the Large Area Telescope, the main instrument on board the Fermi Gamma-ray Space Telescope, which was launched into orbit in 2008 and continues to make numerous discoveries. The lab has also earned a role in building the world’s largest digital camera for an Earth-based observatory, the Large Synoptic Survey Telescope, with construction scheduled to begin in 2014 for eventual operation on a mountaintop in Chile.

Richter, who served as SLAC director from 1984 to 1999, has said that the fast-evolving nature of science necessitates a changing path and pace of research. “Labs can remain on the frontiers of science only if they keep up with the evolution of those frontiers,” remarks Richter. “SLAC has evolved over its first 50 years and is still a world leader in areas beyond what was thought of when it was first built. It is up to the scientists of today to keep it moving and keep it on some perhaps newly discovered frontiers for the next 50.”

This article is based on the one published on the SLAC News Centre.

Stanford University, in California, already has a leading position as far as linear accelerators are concerned. It operates a whole family of linacs, several of which are used for medical purposes. The 200 ft machine [Mark III] in operation there produces 700 MeV electrons and its energy will be stepped up to 1050 MeV.

Late in May, Stanford made the scientific headlines – again with a linac.

Addressing a science research symposium in Manhattan, President Eisenhower announced that he would recommend to the US Congress the financing of a “large new electron linear accelerator … a machine two miles long, by far the largest ever built”.

This machine, intended for Stanford University, would be one of the most spectacular atom smashers ever devised. Two parallel tunnels would have to be driven for two miles into the rock of a small mountain in the vicinity of Palo Alto. Such natural cover would, of course, stop any dangerous radiation. One of the tunnels, the smaller in diameter, would house the accelerator proper, while the bigger one would be used for maintenance purposes.

The proposed new linac for Stanford would initially produce 15 BeV (GeV) electrons; it is announced that this energy could later be raised to 40 BeV. It is believed that the machine would take six years to build, at a cost of 100 million dollars. Approval of the project, which can be given only after Congressional hearings, depends on a decision expected in July.

• From the first issue of CERN Courier.

CEBAF: a fruitful past and a promising future

On 18 May, the US Department of Energy’s Jefferson Lab shut down its Continuous Electron Beam Accelerator Facility (CEBAF) after a long and highly successful 17-year run, which saw the completion of more than 175 experiments in the exploration of the nature of matter. At 8.17 a.m., Jefferson Lab’s director, Hugh Montgomery, terminated the last 6 GeV beam and Accelerator Division associate director, Andrew Hutton, and director of operations, Arne Freyberger, threw the switches on the superconducting RF zones that power CEBAF’s two linear accelerators. Coming up next – the return of CEBAF, with double the energy and a host of other enhancements designed to delve even deeper into the structure of matter.

Jefferson Lab has been preparing for its 12 GeV upgrade of CEBAF for more than a decade. In fact, discussions of CEBAF’s upgrade potential began soon after it became the first large-scale accelerator built with superconducting RF technology. Its unique design features two sections of superconducting linear accelerator, which are joined by magnetic arcs to enable acceleration of a continuous-wave electron beam by multiple passes through the linacs. The final layout took account of CEBAF’s future, allowing extra space for an expansion.

Designed originally as a 4 GeV machine, CEBAF exceeded that target by half as much again to deliver high-luminosity, continuous-wave electron beams at more than 6 GeV to targets in three experimental halls simultaneously. Each beam was fully independent in current, with a dynamic range from picoamps to hundreds of microamps. Exploiting the new technology of gallium-arsenide strained-layer photocathodes provided beam polarizations topping 85%, with sufficient quality for parity-violation experiments.

Inside the nucleon

CEBAF began serving experiments in 1995, bombarding nuclei with the 4 GeV electron beam. Its physics reach soon far outstripped the initial planned experimental programme, which was historically classified in three broad categories: the structure of the nuclear building blocks; the structure of nuclei; and tests of fundamental symmetry.

Experiments exploring the structure of the proton led to the discovery that its magnetization distribution is more compact than its charge distribution. This surprising result, which contradicted previous data, generated many spin-off experiments and renewed interest in the basic structure of the proton. Other studies confirmed the concept of quark–hadron duality, reinforcing the predicted relationship between these two descriptions of nucleon structure. Further measurements found that the contribution of strange quarks to the properties of the proton is small, which was also something of a surprise.

Turning to the neutron, CEBAF’s experiments made groundbreaking measurements of the distribution of electric charge, which revealed that up quarks congregate towards the centre, with down quarks converging along the periphery. Precision measurements were also made of the neutron’s spin structure for the first time, as Jefferson Lab demonstrated the power of its highly polarized deuteron target and polarized helium-3 target.

Studies conducted with CEBAF revealed new information about the structure of the nucleon in terms of quark flavour, while others measured the excited states of the nucleon and found new states that were long predicted in quark models of the nucleon. High-precision data on the Δ resonance – the first excited state of the proton – demonstrated that its formation is not described by perturbative QCD, as some theorists had proposed. Researchers also used CEBAF to make precise measurements of the charged pion form-factor to probe its distribution of electric charge. New measurements of the lifetime of the neutral pion were also performed to test the low-energy effective field theory of QCD.

Following the development of generalized parton distributions (GPDs), a novel framework for studying nucleon structure, CEBAF provided an early experimental demonstration that they can be measured using high-luminosity electron beams. Following the upgrade, it will be possible to make measurements that can combine GPDs with transverse momentum distribution functions to provide 3D representations of nucleon structure.

From nucleons to nuclei

In its explorations of the structure of nuclei, research with CEBAF bridges the descriptions of nuclear structure from experiments that show the nucleus built of protons and neutrons to those that show the nucleus as being built of quarks. The first high-precision determination of the distribution of charge inside the deuteron (built of one proton and one neutron) at short distances revealed information about how the deuteron’s charge and magnetization – terms related to its quark structure – are arranged.

Systematic deep-inelastic experiments with CEBAF have shed light on the EMC effect. Discovered by the European Muon collaboration at CERN, this is an unexpected dip in per-nucleon cross-section ratios of heavy-to-light nuclei, which indicates that the quark distributions in heavy nuclei are not simply the sum of those of the constituent protons and neutrons. The CEBAF studies indicated that the effect could be generated by high-density local clusters of nucleons in the nucleus, rather than by the average density.
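
The quantity at issue is the per-nucleon structure-function ratio, conventionally written (schematically) as

\[ R_{\rm EMC}(x) \;=\; \frac{F_2^A(x)/A}{F_2^{d}(x)/2}, \]

which would stay close to unity if a nucleus of mass number A were simply an incoherent sum of quasi-free nucleons; the characteristic dip below unity at intermediate x is the EMC effect.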

Related studies provided experimental evidence of nucleons that move so close together in the nucleus that they overlap, with their quarks and gluons interacting with each other in nucleon short-range correlations. Further explorations revealed that neutron–proton short-range correlations are 20 times more common than proton–proton short-range correlations. New experiments planned for the upgraded CEBAF will further probe the interactions of protons, neutrons, quarks and gluons to improve understanding of the origin of the nucleon–nucleon force.

High-precision data from CEBAF are also helping researchers to probe nuclei in other ways. Hypernuclear spectroscopy, which exploits the “strangeness” degree of freedom by introducing a strange quark into nucleons and nuclei, is being used to study the structure and properties of baryons in the nucleus, as well as the structure of nuclei. Also, the recent measurement of the “neutron skin” of lead using parity-violation techniques will be used to constrain the calculations of the fate of neutron stars.

CEBAF’s highly polarized, high-luminosity, highly stable electron beams have exhibited excellent quality in energy and position. Coupled with the state-of-the-art cryotargets and large-acceptance precision detectors, this has allowed exploration of physics beyond the Standard Model through parity-violating electron-scattering experiments. Currently, the teams are eagerly awaiting the results of analysis of the experimental determination of the weak charge of the proton.

A bright future

Although the era of CEBAF at 6 GeV is over, the future is still bright. Jefferson Lab’s Users Group has swelled to more than 1350 physicists. They are eager to take advantage of the upgraded CEBAF when it comes back online, with 52 experiments – totalling some six years of beam time – already approved by the laboratory’s Program Advisory Committee (Dudek et al. 2012).

Jefferson Lab is now shut down for installation of the new and upgraded components that are needed to finish the 12 GeV project. At a cost of $310 million, this will enhance the research capabilities of the CEBAF accelerator by doubling its energy and adding an additional experimental hall, as well as by improving the existing halls along with other upgrades and additions.

Preliminary commissioning of an upgrade cryomodule has demonstrated good results. The unit was installed in 2011 and commissioned with a new RF system during CEBAF’s final months of running at 6 GeV. The cryomodule successfully ran at its full specified energy gain of 108 MeV for more than an hour while delivering beam to two experimental halls. Commissioning of the 12 GeV machine is scheduled to commence in November 2013. Beam will be directed first to Hall A and its existing spectrometers, followed by the new experimental facility, Hall D.

Berkeley welcomes real-time enthusiasts

The IEEE-NPSS Real-Time Conference is devoted to the latest developments in real-time techniques in particle physics, nuclear and astrophysics, plasma physics and nuclear fusion, medical physics, space science, accelerators and general nuclear power and radiation instrumentation. Taking place every second year, it is sponsored by the Computer Application in Nuclear and Plasma Sciences technical committee of the IEEE Nuclear and Plasma Sciences Society (NPSS). This year, the 18th conference in the series, RT2012, was organized by the Lawrence Berkeley National Laboratory (LBNL) under the chair of Sergio Zimmermann and took place on 11–15 June at the Shattuck Plaza Hotel in downtown Berkeley, California.

The conference returned to the US after being held in Lisbon for RT2010 and in Beijing in 2009, when the first Asian conference of this series was held at the Institute for High-Energy Physics. RT2012 attracted 207 registrants, with a large proportion of young researchers and engineers. Following the meetings in Beijing and Lisbon, there is now a significant attendance from Asia, as well as from the fusion and medical communities, making the conference an excellent place to meet real-time specialists with diverse interests from around the world.

Presentations and posters

As in the past, the 2012 conference consisted of plenary oral sessions complemented by poster sessions. This format encourages participants to look at real-time developments in sectors other than their own and greatly fosters the necessary interdisciplinary exchange of ideas across the various fields. Following a long tradition, each poster session is associated with a “mini-oral” presentation session, in which presenters can opt for a two-minute talk to emphasize the highlights of their posters. It is also an excellent educational opportunity for young participants to present and promote their work. With a mini-oral presentation still fresh in mind, delegates can then seek out the appropriate author during the following poster session, an approach that stimulates lively and intensive discussions.

The conference began as usual with an opening session with five invited speakers who surveyed hot topics from physics or innovative technical developments. First, David Schlegel of LBNL gave an introduction to the physics of learning about dark energy from the largest galaxy maps. Christopher Marshall of Lawrence Livermore National Laboratory introduced the National Ignition Facility and its integrated computer system. CERN’s Niko Neufeld gave an overview talk on the trigger and data acquisition (DAQ) at the LHC, which provided an introduction to the large number of detailed presentations that followed during the week. Henry Frisch of the University of Chicago presented news from the Large Area Photodetectors project, which aims for submillimetre and subnanosecond resolution in space and time, respectively. Last, Fermilab’s Ted Liu spoke about triggering in high-energy physics, with selected topics for young experimentalists.

The technical programme, organized by Réjean Fontaine of the Université de Sherbrooke, Canada, brought together real-time computing and DAQ applications from a wide range of fields. About half of the topics came from high-energy physics, with the rest mainly from astrophysics, nuclear fusion, medical applications and accelerators.

Some important sessions, such as that on Data Acquisition and Intelligent Signal Processing, started with an invited introductory or review talk. Ealgoo Kim of Stanford University reviewed the trend of data-path structures for DAQ in positron-emission tomography systems, showing how the electronics and DAQ are similar to those for detectors in high-energy physics. Bruno Gonçalves of the Instituto Superior Técnico Lisbon spoke about trends in controls and DAQ in fusion devices, such as ITER, particularly towards reaching the necessary high availability. Riccardo Paoletti of the University of Siena and INFN Pisa presented the status and perspectives on fast waveform digitizers, with many examples being given in following presentations.

Rapid evolution

This year the conference saw the rapid and systematic evolution of intelligent signal processing as it moves further towards front-end signal processing at the start of the DAQ chain. This incorporates ultrafast analogue and timing converters that use the waveform-analysis concept together with powerful digital signal-processing architectures, which are necessary to compress and extract data in real time in a quasi “deadtime-less” process. Read-out systems are now built from programmable devices – field-programmable gate arrays, graphics processing units (GPUs) and digital signal processors – together with the hardware and software techniques and tools needed to program such reconfigurable hardware.
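
As a toy illustration of this kind of front-end data reduction (a minimal sketch in Python; in a real system this logic runs as firmware on streaming ADC samples, and the sample rate, thresholds and pulse shape below are invented), each digitized waveform is reduced to a handful of features – baseline, amplitude, integrated charge and arrival time – before anything is shipped downstream.

# Minimal sketch of front-end waveform reduction: keep a few features per pulse
# instead of the full digitized trace. All numerical settings are illustrative.
import numpy as np

def extract_features(samples, threshold=20.0, dt_ns=1.0):
    """Return (baseline, amplitude, charge, arrival time in ns), or None if no pulse."""
    baseline = float(np.mean(samples[:16]))   # estimate baseline from the first samples
    pulse = samples - baseline
    above = np.nonzero(pulse > threshold)[0]  # simple leading-edge discriminator
    if above.size == 0:
        return None                           # zero-suppression: nothing worth reading out
    t0 = float(above[0]) * dt_ns              # time of first threshold crossing
    amplitude = float(pulse.max())
    charge = float(pulse[above].sum())        # crude integral over samples above threshold
    return baseline, amplitude, charge, t0

# Synthetic example: one exponential pulse starting near sample 40, plus noise
rng = np.random.default_rng(0)
trace = np.zeros(128)
trace[40:50] = 100.0 * np.exp(-np.arange(10) / 3.0)
print(extract_features(trace + rng.normal(0.0, 2.0, size=128)))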

Participants saw the evolution of many new projects that include architectures dealing with fully real-time signal processing, digital data extraction, compression and storage at the front-end, such as the PANDA antiproton-annihilation experiment for the Facility for Antiproton and Ion Research being built at Darmstadt. For the read-out and data-collection systems, the conceptual model is based on fast data transfer, now with multigigabit parallel links from the front-end data buffers up to terabit networks with their associated hardware (routers, switches, etc.). Low-level trigger systems are becoming fully programmable and in some experiments, such as LHCb at CERN, challenging upgrades of the level-0 selection scheme are planned, with trigger processing taking place in real time at large computer farms. There is an ongoing integration of processing farms for high-level triggers and filter farms for online selection of interesting events at the LHC. Experiences with real data were reported at the conference, providing feedback on the improvement of the event selection process.

A survey of control, monitoring and test systems for small and large instruments, as well as new machines – such as the X-ray Free-Electron Laser at DESY – was presented, showing the increasing similarities and possibilities for integration with standard DAQ systems of these instruments. A new track at the conference this year dealt with upgrades of existing systems, mainly related to LHC experiments at CERN and to Belle II at KEK and the SuperB project.

The conference saw an increasing number of applications and projects using new standards and emerging technologies such as the Advanced Telecommunications Computing Architecture (ATCA), as well as feedback on the experience and lessons learnt from successes and failures. This last topic, in particular, was new at this conference. Rather than showing only great achievements in glossy presentations, it can also be helpful to learn from other people’s difficulties, problems and even mistakes.

CANPS Prize awarded

A highlight of the Real-Time conference is the presentation of the CANPS prize, which is given to individuals who have made outstanding contributions in the application of computers in nuclear and plasma sciences. This year the award went to Christopher Parkman, now retired from CERN, for the “outstanding development and user support of modular electronics for the instrumentation in physics applications”. Special efforts were also made to stimulate student contributions and awards were given for the three best student papers, selected by a committee chaired by Michael Levine of Brookhaven National Laboratory.

Last, an industrial exhibit by a few relevant companies ran through the week (CAEN, National Instruments, Schroff, Struck, Wiener and ZNYX). There was also the traditional two-day workshop on ATCA and MicroTCA, the latest DAQ standards to follow CAMAC, Fastbus and VME, this time adopted from the telecommunications industry. This workshop with tutorials, organized by Ray Larsen and Zheqiao Geng of SLAC and Sergio Zimmermann of LBNL, took place during the weekend before the conference. Two short courses were also held that same weekend, one by Mariano Ruiz of the Technical University of Madrid on DAQ systems and one by Hemant Shukla of LBNL on data analysis with fast graphics cards (GPUs).

The 19th Real-Time Conference will take place in May 2014 in the deer park inside the city of Nara, Japan. It will be organized jointly by KEK, Osaka University and RIKEN under the chair of Masaharu Nomachi. A one-week Asian summer school on advanced techniques in electronics, trigger, DAQ and read-out systems will also be organized jointly with the conference.

• More details about the Real-Time Conference are available on the conference website. A special edition of IEEE Transactions on Nuclear Science will include all eligible contributions from the RT2012 conference, with Sascha Schmeling of CERN as senior editor.

Charting the future of European particle physics

Tatsuya Nakada

The original CERN convention, which was drafted nearly 60 years ago, foresaw that the organization should have a role as co-ordinator for European particle physics, as well as operating international accelerator laboratories. Today, this role is more appropriate than ever: the long lead times usually required to prepare and construct facilities and experiments for modern high-energy physics, together with the increased costs for these activities, underline the need for a general European strategy in the field. So it was natural for CERN Council to initiate the creation of a European Strategy for Particle Physics in June 2005 and to establish dedicated groups for reviewing the scientific status and producing a proposal. They consulted widely with the community, funding agencies and policy makers in preparing the strategy document, which was adopted by Council in July 2006 during a dedicated session in Lisbon.

The strategy consists of 17 concise descriptions, with action statements. It addresses not only scientific issues but also subjects such as the organization and social relevance of high-energy physics. The highest priority on the scientific programme was given to the LHC, followed by accelerator R&D for possible future high-energy machines, including the luminosity and energy upgrades of the LHC, linear e+e– colliders and neutrino facilities.

CERN Council adopted this strategy in 2006 with the understanding that it would be brought up to date at intervals of typically five years. The first update is now being prepared for presentation to Council in 2013, the process having been postponed for two years to wait for data from the LHC at energies of 7 and 8 TeV in the centre of mass. As a result, in addition to the recent discovery at the LHC of a new boson that is compatible with the Standard Model Higgs particle, the third mixing angle of the neutrino mass-matrix has become known through experiments elsewhere.

These new results generate more scientific questions compared with 2006, such as:

• How far can the properties of the Higgs(-like) particle be explored at the LHC, with the 300 fb–1 of data expected for Phase 1, and with the 1000–3000 fb–1 (1–3 ab–1) that the high-luminosity upgrade should yield? Do we need other machines to study the particle’s properties? If so, after taking into account factors such as the technical maturity, energy expandability, cost and location, what is the optimal machine: a linear or circular e+e– collider, a photon collider or a muon collider? As a more concrete question, what should the European reaction be towards the linear collider that is being considered in Japan?

• The European neutrino community is putting forward a short-baseline neutrino programme to search for sterile neutrinos, as well as a long-baseline one to measure neutrino-mass mixing parameters, to take place in Europe. In addition, R&D studies are underway for a “neutrino factory” as an eventual facility. But what should the European neutrino programme be, and where does the global aspect start to play a role?

• What are the options for a future machine in Europe after the LHC? Will this be a machine to address physics at the 10 TeV energy scale? Will data from the LHC at the full design energy provide enough justification for this? When will be the right moment to take a decision, and what kind of R&D must be done to be ready for such a decision in the future?

Breakthroughs in science can emerge from unexpected corners. Therefore, the strategy must also have some flexibility to allow the fostering of unconventional ideas.

The process of updating the European strategy began formally in the summer of 2011 when Council set up a new European Strategy Group, which is assisted by the European Strategy Preparatory Group for scientific matters in preparing the proposal for the update. As with the process that led to the original strategy, the proposal will be based on the maximum input from the particle-physics community, as well as from other stakeholders – both inside and outside Europe. An important part of this consultation process was the Open Symposium recently held in Krakow, where the community expressed their opinions on the subjects outlined above, as well as on flavour physics, strong-interaction physics, non-accelerator-based particle physics and theoretical physics. Issues important for carrying out the research programme, such as accelerator science, detector R&D, computing and infrastructure for large detector construction, were also addressed. The meeting demonstrated that there is an emerging consensus that new physics must be studied both by direct searches at the highest-energy accelerator possible, as well as by precision experiments with and without accelerators.

The Preparatory Group is in the process of producing a summary document on the scientific status. The European Strategy Group will meet in January 2013 in Erice to draft the updated strategy – which must also take global aspects into account – for discussion by CERN Council in March. The aim is that Council will adopt the updated strategy during a special session to be held in Brussels in May.

• Further information on the update of the European Strategy of Particle Physics may be found at https://europeanstrategygroup.web.cern.ch/EuropeanStrategyGroup/.

The Quantum Exodus: Jewish Fugitives, the Atomic Bomb, and the Holocaust

By Gordon Fraser
Oxford University Press
Hardback: £25

Don’t be misled by the title of this book. It contains a surprising amount of information, going well beyond the exodus of Jewish scientists from Germany after the rise of the Nazi Party. The book puts anti-Semitism into a broad historical perspective, starting with the destruction of the Temple in Jerusalem, the expulsion of Jews across Europe and the growth of a mild and sometimes hidden anti-Semitism. This existed in Germany in the 19th century and even to some extent under the Nazis, when the initial objective was to cleanse German culture of all non-Aryan influences. However, various phases led eventually to the Holocaust. A political spark was ignited when Adolf Hitler became Chancellor in January 1933 and the parliamentary building in Berlin went up in flames the following month. The Civil Service Law, which forbade Jews from being employed by the state, was soon introduced, followed by the burning of books and the Kristallnacht, during which Jewish shops were destroyed – all of which were further steps towards the “final solution”.

In parallel to these political developments, Quantum Exodus describes the rise of physics in Germany from the 19th century onwards, culminating in quantum theory, with protagonists such as Alexander von Humboldt, Wilhelm Röntgen, Hermann von Helmholtz, Max Planck, Walther Nernst and Arnold Sommerfeld. They attracted many Jewish scientists from all over Europe, among them Hans Bethe, Max Born, Peter Debye, Albert Einstein, Lise Meitner, Leó Szilárd, Edward Teller and Eugene Wigner, who went on to become key players in 20th-century physics. Most of them left Germany, some early on, others escaping at the last moment; most went to the UK or the US, often via Denmark, with Niels Bohr’s institute serving as a temporary shelter. An exodus also started from other countries, such as Austria and Italy. The book recounts the adventurous and disheartening fates of many of these physicists. Arriving as refugees, they were often initially considered aliens and, during the war, sometimes even spies. The author gives some spice to his narrative by adding amusing details from the private lives of some of the protagonists.

A detailed account is given of the Manhattan Project and how the famous letter from Einstein to President Franklin Roosevelt initiated the building of the fission bomb. It was written as a result of pressure from Szilárd, the main mover behind the scenes. What is less well known is the crucial importance of a paper by Otto Frisch and Rudolf Peierls in the UK, which already contained the detailed ideas of the fission bomb. Robert Oppenheimer, an American Jew, became scientific director of the Manhattan Project after his studies in Europe, bringing the European mindset to the US. He attracted many émigrés to the project, such as Bethe, Teller, Felix Bloch and Victor Weisskopf. The book relates vividly how Teller, because of his stubborn character, could not be well integrated into this project; rather, he pushed in parallel for the H-bomb.

The author implies, although somewhat indirectly, that the rise of Nazism and the development of the nuclear bomb have a deeper correlation, without giving convincing details. However, the interaction of science (and its stars) with politics is well described. Bohr’s influence, although he was at the centre of nuclear physics, was limited – partly because of his mumbling and bad English (something that I witnessed at the Geneva Atoms for Peace Conference in 1957, where his allocution in English had to be translated simultaneously into English).

Many of the exiled physicists who worked on the Manhattan Project developed considerable remorse after the events of Hiroshima and Nagasaki. When I invited Isidor Rabi to speak at the 30th anniversary of CERN, he said that he considered his involvement in the foundation of CERN a kind of recompense for his wartime activities.

The descriptive account of science in the US and Europe after the Second World War is interesting. In the US, politicians’ interest in science decreased substantially and a change was introduced only when the shock of Sputnik led eventually to the “space race”. Basic science also benefited from this change, leading for example to the foundation of various national laboratories such as Fermilab. In Europe, a new stage for science emerged when a pan-European centre to provide resources on a continental rather than a national scale was proposed and CERN was founded in 1954.

The book benefits from the fact that the author is competent in physics, which he sometimes describes poetically, but never wrongly. He has done extremely careful research, giving many references and a long list of Jewish emigrants. I found few points to criticise. Minor objections concern passages about CERN, even though the author knows the organization well. For example, CERN’s response to the Superconducting Super Collider was the final choice of the circumference of the LEP tunnel (27 km), made in view of the possibility of a later proton–proton or proton–electron collider in the same tunnel, while the definitive LHC proposal came only in 1987; and the LHC magnets are superconducting to achieve the necessary high magnetic fields, not so much to save electricity.

The various chapters are not written in chronological order, and political or scientific developments are integrated with human destinies. This makes for easy and entertaining reading. Older readers who, like me, have known many of the protagonists will not avoid poignant emotions. For younger readers, the book is recommended because they will learn many historical facts that should not be forgotten.

One intriguing question (probably unanswerable) that was not considered is: what would have happened to US science without the contribution of Jewish immigrants?

Deflector shields protect the lunar surface

The origin of the enigmatic “lunar swirls” – patches of relatively pale lunar soil, some measuring several tens of kilometres across – has been an unresolved mystery since the mid-1960s, when NASA’s Lunar Orbiter spacecraft mapped the surface of the Moon in preparation for the Apollo landings. Now, a team of physicists has used a combination of satellite data, plasma-physics theory and laboratory experiments to show how the features can arise when the hot plasma of the solar wind is deflected around “mini-magnetospheres” associated with magnetic anomalies at the surface of the Moon.

The swirls were initially thought to be smeared-out craters, but close-range photographs from Lunar Orbiter II showed that at least one large example – named Reiner Gamma, after the nearby Reiner impact crater – could not be a crater. Studies from subsequent Apollo missions revealed that the swirls are associated with localized magnetic fields in the lunar crust. Because the Moon today has no overall magnetic field, these “magnetic anomalies” seem to be remnants of a field that existed in the past.

In 1998–1999, the Lunar Prospector mission discovered that the magnetic anomalies create miniature magnetospheres above the Moon’s surface, just as the Earth’s planetary magnetic field does on a much larger scale when it deflects the charged particles of the solar wind around the planet. Could the mini-magnetospheres on the Moon, which are only a few hundred kilometres in size, somehow shield the crust from the solar wind and so prevent the surface from darkening as a result of constant bombardment by incoming particles?

One problem with this idea has been that the magnetic fields – in the order of nanotesla – seem to be too weak to affect the energetic particles of the solar wind on the scales observed. However, a team led by Ruth Bamford of the Rutherford Appleton Laboratory and York University has shown that it is the electric field associated with the shock formed when the solar wind interacts with the magnetic field that deflects the particles bombarding the Moon.

Data from various lunar-orbiting spacecraft suggested a picture in which the solar wind is deflected round a magnetic “bubble”, creating the effect of a cavity within the plasma density that is enclosed by a “skin” that is only kilometres thick. This skin effectively reflects incoming protons, increasing their energy.

To explain these observations, Bamford and colleagues invoke a two-fluid model of the plasma, with unmagnetized ions and magnetized electrons. The electrons are slowed down and deflected by the magnetic barrier that forms when the magnetic field of the solar wind encounters the magnetic anomaly – but the much heavier ions do not respond so quickly. This leads to a separation in space-charge and hence an electric field.
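
In such a two-fluid picture the electric field follows from the electron momentum balance. Neglecting electron inertia and resistivity, this gives the standard generalized Ohm’s law (quoted here as a textbook relation for orientation, not as the specific form used by the authors):

\[ \mathbf{E} + \mathbf{u}\times\mathbf{B} \;=\; \frac{1}{e\,n_e}\left(\mathbf{J}\times\mathbf{B} \;-\; \nabla p_e\right). \]

Over the kilometre-scale “skin” the Hall and electron-pressure terms on the right-hand side become large, so a strong electrostatic field builds up across the thin boundary – strong enough to turn back the unmagnetized ions even though the magnetic field itself is far too weak to do so directly.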

The team confirmed the principle of their theoretical model by using a plasma wind tunnel with a supersonic stream of hydrogen plasma and the dipole field of a magnet. The experiment showed that the plasma particles were indeed “corralled” by a narrow electrostatic field to form a cavity in the plasma, so protecting areas of the surface towards which the particles were flowing. Translated to the more irregular magnetic fields on the lunar surface, with a range of overlapping cavities, this can provide the long-awaited explanation of the light and dark patterns – protected and unprotected regions, respectively – that make up the swirls.

Baryon oscillation spectra for all

By professional astronomy standards, the 2.5 m telescope at Apache Point Observatory is quite small. More than 50 research telescopes are larger and many are located at much better sites. Apache Point Observatory is also a little too close to city lights – the atmospheric turbulence that dominates the sharpness of focus is about two times worse than at the best sites on Earth – and summer monsoons shut down the observatory for two months each year.

Yet the Sloan Digital Sky Survey (SDSS), using this telescope, has produced the most highly cited data set in the history of astronomy (Trimble and Ceja 2008; Madrid and Macchetto 2009). Its success is rooted in the combination of high-quality, multipurpose data and open access for everyone: SDSS has obtained five-filter images of about a quarter of the sky and spectra of 2.4 million objects, and has made them publicly available on a yearly basis, even as the survey continues.

SDSS-III launched its ninth data release (DR9) on 31 July. This is the first release to include data from the upgraded spectrographs of the Baryon Oscillation Spectroscopic Survey (BOSS) – the largest of the four subsurveys of SDSS-III. By measuring more distant galaxies, these spectra probe a larger volume of the universe than all previous surveys combined.

BOSS has already published its flagship measurement of baryon acoustic oscillations (BAO) to constrain dark energy using these data (Anderson et al. 2012). BAO are the leftover imprint of primordial matter-density fluctuations that froze out as the universe expanded, leaving correlations in the distances between galaxies. The size scale of these correlations acts as a “standard ruler” to measure the expansion of the universe, complementing the “standard candles” of Type Ia supernovae that led to the discovery of the accelerating expansion of the universe.
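
Schematically, the standard-ruler argument works as follows (these are the standard relations, not details specific to the BOSS analysis): the comoving sound horizon at the drag epoch, r_d, of roughly 150 Mpc, sets a preferred galaxy separation, so the BAO feature measured at redshift z appears at an angular scale and a redshift interval of approximately

\[ \Delta\theta \;\simeq\; \frac{r_d}{(1+z)\,D_A(z)}, \qquad \Delta z \;\simeq\; \frac{r_d\,H(z)}{c}. \]

With r_d calibrated from the cosmic microwave background, the measured scales yield the angular-diameter distance D_A(z) and the expansion rate H(z), and hence the expansion history that dark energy affects.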

Another major BOSS analysis using these data is still in progress. In principle, BAO can also be measured by using bright, distant quasars as backlights and measuring the “Lyman alpha forest” absorption in the spectra as intervening neutral hydrogen absorbs the quasars’ light. The wavelength of the absorption traces the redshift of the hydrogen and the amount of absorption traces its density. Thus, this also measures the structure of matter – including BAO – but at much greater distances than is possible with galaxies. BOSS has the first data set with enough quasars to make this measurement and the collaboration is nearing completion of this analysis. However, the final results are not yet published and now the data are public for anyone else to try this.

Are there any surprises in the results? Not yet. BOSS has the most accurate BAO measurements yet, with distances measured to 1.7%, but the results are consistent with the “ΛCDM” cosmological standard model, which includes a dark-energy cosmological constant (Λ) and cold dark matter (CDM). But DR9 contains only about a third of the full BOSS survey and BOSS has already finished observations for data release 10 (DR10), due to be released in July 2013. DR10 will also include the first data from APOGEE, another SDSS-III subsurvey that probes the dynamical structure and chemical history of the Milky Way.

Summer running at the LHC

The LHC has delivered more than twice as many collisions to the ATLAS and CMS experiments this year as it did in all of 2011. On 4 August, the integrated luminosity recorded by each of the experiments passed the 10 fb–1 mark. Last year, they each recorded data corresponding to around 5.6 fb–1. On 22 August this year, the more specialized LHCb experiment passed 1.11 fb–1, the same as its entire data sample for 2011.

The LHC’s peak luminosity had been running 5–10% lower following June’s technical stop. This was mainly owing to a slight degradation in beam quality from the injectors – an issue that was resolved at the beginning of August. The LHC had also been suffering from occasional beam instabilities, which have resulted in significant beam losses. A solution to this second problem lay in finding new optimum machine settings with the polarity of octupole magnets reversed relative to that of recent years. (The octupole magnets are used to correct beam instabilities.)

This reversal, accompanied by an adjustment of the settings in the sextupole magnets, was studied over several days in August. These changes paid off and the beams became more stable when brought into collision, so the bunch currents could be increased from 1.5 × 10¹¹ to 1.6 × 10¹¹ protons per bunch. With this increased bunch intensity, the peak luminosity in ATLAS and CMS reached more than 7.5 × 10³³ cm–2 s–1, compared with the maximum peak luminosity of 3.6 × 10³³ cm–2 s–1 in 2011.
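
The gain from the higher bunch charge alone can be estimated from the usual luminosity scaling (a back-of-the-envelope estimate that ignores any accompanying changes in emittance, crossing angle or number of bunches), where N is the bunch population, n_b the number of bunches, f_rev the revolution frequency and σ_x, σ_y the transverse beam sizes at the interaction point:

\[ \mathcal{L} \;\propto\; \frac{n_b\,N^2\,f_{\rm rev}}{4\pi\,\sigma_x\,\sigma_y} \quad\Rightarrow\quad \left(\frac{1.6\times 10^{11}}{1.5\times 10^{11}}\right)^{2} \;\approx\; 1.14, \]

i.e. roughly a 14% increase in peak luminosity from the bunch-intensity step, with the much larger improvement over 2011 coming from the other changes to the machine configuration.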

In addition, successful commissioning of injection and RF-capture using new Super Proton Synchrotron optics (called Q20 optics) has opened the way for even higher bunch intensities. This new optics system has yet to be used operationally.

During the summer runs, the machine regularly enjoyed long fills in the 12- to 15-hour range. This showed the benefits of the extensive consolidation work to mitigate the effects of radiation to electronics in the LHC tunnel and the continuing efforts to improve overall reliability. The LHC is well on its way towards its goal of delivering in the order of 15 fb–1 in 2012. Indeed, at the beginning of September, CMS and ATLAS had already recorded more than 13 fb–1.

Illuminating extra dimensions with photons

Photons are a critical tool at the LHC, and the ATLAS detector has been carefully designed to measure photons precisely. In addition to playing a central role in the recent discovery of a new particle resembling the Higgs boson, final states with photons are used both to make sensitive tests of the Standard Model and to search for physics beyond it.

Recent results from the ATLAS experiment using the full 2011 data set are shining new light – in more than one sense – on theoretical models that propose the existence of extra dimensions. In these models, which were originally inspired by string theory, the extra dimensions are “compactified” – finite in extent, they are curled up on themselves and so small that they have not yet been observed. Such models could answer a major mystery in particle physics, namely the weakness of gravity as compared with the other forces. The basic idea is that gravity’s influence could be diluted by the presence of the extra dimensions. Different variants of these models exist, with corresponding differences in how they could be detected experimentally.
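
The dilution can be made quantitative in the ADD picture (a schematic relation, quoted up to numerical factors): if gravity propagates in n compact extra dimensions of size R, Gauss’s law in 4+n dimensions relates the observed Planck scale to the fundamental gravity scale M_D through

\[ M_{\rm Pl}^{2} \;\sim\; M_D^{\,n+2}\,R^{\,n}, \]

so a fundamental scale of order a TeV can reproduce the enormous apparent Planck mass provided the extra dimensions are large enough.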

Events with two energetic photons provide a good place to search. In the Randall-Sundrum (RS) models of extra dimensions, a new heavy particle could decay to a pair of photons. A plot of the diphoton mass should then reveal a narrow peak above the smooth spectrum expected from Standard Model processes. In Arkani-Hamed-Dimopoulos-Dvali (ADD) models, on the other hand, the influence of extra dimensions should lead to a broad excess of events at large diphoton masses.

The figure shows the diphoton mass spectrum measured by ATLAS. The Standard Model background expectation has been superimposed, as have contributions expected for examples of RS or ADD signals. The data agree well with the background expectation and provide stringent constraints on the extra-dimension models. For instance, the mass of the RS graviton must be larger than 1–2 TeV, depending on the strength of the graviton’s couplings to Standard Model particles.

ADD models can also be probed via the single-photon final state. The ATLAS collaboration has searched for single photons accompanied by a large apparent imbalance in the energy measured in the event, which would result from a particle escaping into the extra dimensions and taking its energy with it. The ATLAS analysis found a total number of such events in agreement with the expectation for the small Standard Model backgrounds. The final result, therefore, was used to establish new constraints on the fundamental scale parameter MD of the so-called ADD Large Extra Dimension (LED) model. The lower limits set on the scale, which improve on previous limits, lie in the range 1.74–1.87 TeV, depending upon the number of extra dimensions.
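
The “apparent imbalance in the energy measured in the event” is the missing transverse momentum, defined in the usual way as the magnitude of the negative vector sum of the transverse momenta of all reconstructed objects,

\[ E_T^{\rm miss} \;=\; \Bigl|\,-\sum_i \vec{p}_{T,i}\,\Bigr|, \]

so a photon recoiling against a graviton that escapes undetected into the extra dimensions shows up as a single energetic photon accompanied by large missing transverse momentum.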

As expected, photons are proving to be an extremely useful probe for new physics at the LHC, providing important tests of many models. With the higher LHC energy in 2012 and the larger data set being accumulated, photon analyses will continue to provide an ever greater potential for discovery.
