The founding of Fermilab

In 1960, two high-energy physics laboratories were competing for scientific discoveries. The first was Brookhaven National Laboratory on Long Island in New York, US, with its 33 GeV Alternating Gradient Synchrotron (AGS). The second was CERN in Switzerland, with its 28 GeV Proton Synchrotron (PS). That year, the US Atomic Energy Commission (AEC) received several proposals to boost the country’s research programme through the construction of new accelerators with energies between 100 and 1000 GeV. A joint panel of President Kennedy’s Science Advisory Committee and the AEC’s General Advisory Committee was formed to consider the submissions, chaired by Harvard physicist and Manhattan Project veteran Norman Ramsey. By May 1963, the panel had decided to have the Lawrence Radiation Laboratory in Berkeley, California, design a several-hundred-GeV accelerator. The result was a 200 GeV synchrotron costing approximately $340 million.

When Cornell physicist Robert Rathbun Wilson, a student of Lawrence’s who had also worked on the Manhattan Project, saw Berkeley’s plans, he considered them too conservative, unimaginative and too expensive. A modest yet proud man, Wilson thought he could design a better accelerator for less money, and let his thoughts be known. By September 1965, he had proposed an alternative, innovative and less costly (approximately $250 million) design for the 200 GeV accelerator to the AEC. The Joint Committee on Atomic Energy, the congressional body responsible for AEC projects and budgets, approved his plan.

During this period, coinciding with the Vietnam War, the US Congress hoped to contain costs. Yet physicists hoped to make breakthrough discoveries, and thought it important to appeal to national interests. The discovery of the Ω particle at Brookhaven in 1964 led high-energy physicists to conclude that “an accelerator ‘in the range of 200–1000 BeV’ would ‘certainly be crucial’ in exploring the ‘detailed dynamics of this strong SU(3) symmetrical interaction’.” Simultaneously, physicists were expressing frustration with the geographic situation of US high-energy physics facilities. East and West Coast laboratories like Lawrence Berkeley Laboratory and Brookhaven did not offer sufficient opportunity for the nation’s experimental physicists to pursue their research. Managed by regional boards, the programmes at these two labs were directed by and accessible to physicists from nearby universities. Without substantial federal support, other major research universities struggled to compete with these regional laboratories.

Against this backdrop arose a major movement to accommodate physicists in the centre of the country and offer more equal access. Columbia University experimental physicist Leon Lederman championed “the truly national laboratory” that would allow any qualifying proposal to be conducted at a national, rather than a regional, facility. In 1965, a consortium of major US research universities, Universities Research Association (URA), Inc., was established to manage and operate the 200 GeV accelerator laboratory for the AEC (and its successor agencies the Energy Research and Development Administration (ERDA) and the Department of Energy (DOE)) and address the need for a more national laboratory. Ramsey was president of URA for most of the period 1966 to 1981.

Following a nationwide competition organised by the National Academy of Sciences, in December 1966 a 6800-acre site in Weston, Illinois, around 50 km west of Chicago, was selected. Another suburban Chicago site, north of Weston in affluent South Barrington, had withdrawn when local residents “feared that the influx of physicists would ‘disturb the moral fibre of their community’”. Robert Wilson was selected to direct the new 200 GeV accelerator laboratory, named the National Accelerator Laboratory (NAL). Wilson asked Edwin Goldwasser, an experimental physicist from the University of Illinois at Urbana-Champaign and a member of Ramsey’s panel, to be his deputy director, and the pair set up temporary offices in Oak Brook, Illinois, on 15 June 1967. They began to recruit physicists from around the country to staff the new facility and design the 200 GeV accelerator, also attracting personnel from Chicago and its suburbs. President Lyndon Johnson signed the bill authorising funding for the National Accelerator Laboratory on 21 November 1967.

Chicago calling

It wasn’t easy to recruit scientific staff to the new laboratory in open cornfields and farmland with few cultural amenities. That picture lies in stark contrast to today, with the lab encircled by suburban sprawl encouraged by highway construction and development of a high-tech corridor with neighbours including Bell Labs/AT&T and Amoco. Wilson encouraged people to join him in his challenge, promising higher energy and more experimental capability than originally planned. He and his wife, Jane, imbued the new laboratory with enthusiasm and hospitality, just as they had experienced in the isolated setting of wartime-era Los Alamos while Wilson carried out his work on the Manhattan Project.

Wilson and Goldwasser worked on the social conscience of the laboratory and in March 1968, a time of racial unrest in the US, they released a policy statement on human rights. They intended to: “seek the achievement of its scientific goals within a framework of equal employment opportunity and of a deep dedication to the fundamental tenets of human rights and dignity…The formation of the Laboratory shall be a positive force…toward open housing…[and] make a real contribution toward providing employment opportunities for minority groups…Special opportunity must be provided to the educationally deprived…to exploit their inherent potential to contribute to and to benefit from the development of our Laboratory. Prejudice has no place in the pursuit of knowledge…It is essential that the Laboratory provide an environment in which both its staff and its visitors can live and work with pride and dignity. In any conflict between technical expediency and human rights we shall stand firmly on the side of human rights. This stand is taken because of, rather than in spite of, a dedication to science.” Wilson and Goldwasser brought inner-city youth out to the suburbs for employment, training them for many technical jobs. Congress supported this effort and was pleased to recognise it during the civil-rights movement of the late 1960s. Its affirmative spirit endures today.

When asked by a congressional committee authorising funding for NAL in April 1969 about the value of the research to be conducted at NAL, and if it would contribute to national defence, Wilson famously answered: “It has only to do with the respect with which we regard one another, the dignity of men, our love of culture…It has to do with, are we good painters, good sculptors, great poets? I mean all the things we really venerate and honour in our country and are patriotic about. It has nothing to do directly with defending our country except to help make it worth defending.”

A harmonious whole

Wilson, who had promised to complete his project on time and under budget, conceived of the new laboratory as a beautiful, harmonious whole. He felt that science, technology and art are importantly connected, and brought a graphic artist, Angela Gonzales, with him from Cornell to give the laboratory site and its publications a distinctive aesthetic. He had his engineers work with a Berkeley colleague, William Brobeck, and an architectural-engineering group, DUSAF, to prepare designs and cost estimates for early submission to the AEC, in time for the agency’s budget requests to the congressional committees that controlled NAL’s funding. Wilson appreciated frugality and minimal design, but also tried to leave room for improvements and innovation. He thought design should be ongoing, with improvements implemented as soon as they were demonstrated, before designs became conservative.

There were many decisions to be made in creating the laboratory Wilson envisioned. Many had to be modified, but this was part of his approach: “I came to understand that a poor decision was usually better than no decision at all, for if a necessary decision was not made, then the whole effort would just wallow – and, after all, a bad decision could be corrected later on,” he wrote in 1987. An example was the magnets in the Main Ring, the original name of the 200 GeV synchrotron, which had to be redesigned, as did the plans for the layout of the experimental areas. Even the design of the distinctive Central Laboratory building, constructed after the accelerator achieved its design energy and renamed Robert Rathbun Wilson Hall in 1980, required adjustments to its initial concept. Wilson said that “a building does not have to be ugly to be inexpensive” and he orchestrated a competition among his selected architects to create the final design of this visually striking structure. To save money he set up competitions between contractors, so that the fastest to finish a satisfactory project were rewarded with more jobs. Consequently, the Main Ring was completed on time by 30 March 1972 and under the $250 million budget. NAL was dedicated and renamed Fermilab on 11 May 1974.

International attraction

Experimentalists from Europe and Asia flocked to propose research at the new frontier facility in the US, forging larger collaborations with American colleagues. Its forefront position and philosophy attracted the top physicists of the world, with Russian physicists making news working on the first approved experiment at Fermilab at the height of the Cold War. Congress was pleased and the scientists were overjoyed with more experimental areas than originally planned and with higher energy, as the magnets were improved to attain 400 GeV and 500 GeV within two years. The higher energy in a fixed-target accelerator complex allowed more innovative experiments, in particular enabling the discovery of the bottom quark in 1977.

Fermilab’s early intellectual environment was influenced by theoretical physicists Robert Serber, Sam Treiman, J D Jackson and Ben Lee, who later brought Chris Quigg and Bill Bardeen, who in turn invited many distinguished visitors to add to the creative milieu of the laboratory. Already on Wilson’s mind was a colliding-beams accelerator he called an “energy doubler”, which would employ superconductivity, and he had established working groups to study the idea. But Wilson encountered budget conflicts with the AEC’s successor, the new Department of Energy, which led to his resignation in 1978. He joined the faculties of the University of Chicago and Columbia University briefly before returning to Cornell in 1982.

Fermilab’s future was destined to move forward with Wilson’s ideas of superconducting-magnet technology, and a new director was sought. Lederman, who was spokesperson of the Fermilab study that discovered the bottom quark, accepted the position in late 1978 and immediately set out to win support for Wilson’s energy doubler. An accomplished scientific spokesman, Lederman achieved the necessary funding by 1979 and promoted the energy-enhancing idea of introducing an antiproton source to the accelerator complex to enable proton–antiproton collisions. Experts from Brookhaven and CERN, as well as the former USSR, shared ideas with Fermilab physicists to bring superconducting-magnet technology to fruition at Fermilab. Under the leadership of Helen Edwards, Richard Lundy, Rich Orr and Alvin Tollestrup, the Main Ring evolved into the energy doubler/saver in 1983 with a new ring of superconducting magnets installed below the early Main Ring magnets. This led to a trailblazing era during which Fermilab’s accelerator complex, now called the Tevatron, would lead the world in high-energy physics experiments. By 1985 the Tevatron had achieved 800 GeV in fixed-target experiments and 1.6 TeV in colliding-beam experiments, and by the time of its closure in 2011 it had reached 1.96 TeV in the centre of mass – just shy of its original goal of 2 TeV.

Theory also thrived at Fermilab in this period. Lederman had brought James Bjorken to Fermilab’s theoretical physics group in 1980 and a theoretical astrophysics group founded by Rocky Kolb and Michael Turner was added to Fermilab’s research division in 1983 to address research at the intersection of particle physics and cosmology. Lederman also expanded the laboratory’s mission to include science education, offering programmes to local high-school students and teachers, and in 1980 opened the first children’s centre for employees of any DOE facility. He founded the Illinois Mathematics and Science Academy in 1985 and the Chicago Teachers Academy for Mathematics and Science in 1990, and the Lederman Science Education Center on the Fermilab site is named after him. Lederman also reached out to many regions including Latin America and partnered with businesses to support the lab’s research and encourage technology transfer. The latter included Wilson’s early Fermilab initiative of neutron therapy for certain cancers, which later would see Fermilab build the 70–250 MeV proton synchrotron for the Loma Linda Medical Center in California.

Scientifically, the target in this period was the top quark. Fermilab and CERN had planned for a decade to detect the elusive top, with Fermilab deploying two large international experimental teams at the Tevatron – CDF (founded by Tollestrup) and DZero (founded by Paul Grannis) – from 1976 to 1995. In 1988 Lederman shared the Nobel prize for the discovery of the muon neutrino at Brookhaven 25 years previously, and in 1989 he stepped down as Fermilab director and joined the faculty of the University of Chicago and later the Illinois Institute of Technology.

Lederman was succeeded by John Peoples, a machine builder and Fermilab experimentalist since 1970, and leader of the Fermilab antiproton source from 1981 to 1985. Peoples had his hands full not only with Fermilab and its research programme but also with the Superconducting Super Collider (SSC) laboratory in Texas. In 1993 the SSC was cancelled and Peoples was asked by the DOE to close down the project and its many contracts. The only person to direct two national laboratories at the same time, Peoples successfully managed both tasks and returned to Fermilab to see the discovery of the top quark in 1995. He also launched the Main Injector, a luminosity-enhancing upgrade to the Tevatron that was completed in 1999. Peoples stepped down as laboratory director that summer and became director of the Sloan Digital Sky Survey (SDSS) – Fermilab’s first astrophysics experiment. He later directed the Dark Energy Survey and in 2010 he retired, continuing to serve as director emeritus of the laboratory.

Intense future

In 1999, experimentalist and former Fermilab user Michael Witherell of the University of California at Santa Barbara became Fermilab’s fourth director. Ongoing fixed-target and colliding-beam experiments continued under Witherell, as did the SDSS and the Pierre Auger cosmic-ray experiment, and the neutrino programme with the Main Injector. Mirroring the spirit of US–European competition of the 1960s, this period saw CERN begin construction of the Large Hadron Collider (LHC) to search for the Higgs boson at a lower energy than the cancelled SSC. Accordingly, the luminosity of the Tevatron became a priority, as did discussions about a possible future international linear collider. After launching the Neutrinos at the Main Injector (NuMI) research programme, including sending the underground particle beam off-site to the MINOS detector in Minnesota, Witherell returned to Santa Barbara in 2005 and in 2016 he became director of the Lawrence Berkeley Laboratory.

Physicist Piermaria Oddone from Lawrence Berkeley Laboratory became Fermilab’s fifth director in 2005. He pursued a renewal of Fermilab’s accelerator complex in order to exploit the intensity frontier and explore new physics with a plan called “Project X”, part of the “Proton Improvement Plan”. Yet the last decade has been a challenging time for Fermilab, with budget cuts, reductions in staff and a redefinition of its mission. The CDF and DZero collaborations continued their search for the Higgs boson, narrowing the region where it could exist, but the more energetic LHC always had the upper hand. In the aftermath of the global economic crisis of 2008, as the LHC approached switch-on, Oddone oversaw the shutdown of the Tevatron in 2011. A Remote Operations Center in Wilson Hall and a special US Observer agreement allowed Fermilab physicists to co-operate with CERN on LHC research and participate in the CMS experiment. The Higgs boson was duly discovered at CERN in 2012 and Oddone retired the following year.

Under its sixth director, Nigel Lockyer, a former Fermilab user and former director of the TRIUMF laboratory in Vancouver, Fermilab now looks ahead to shine once more through continued exploration of the intensity frontier and the properties of neutrinos. In the next few years, Fermilab’s Long-Baseline Neutrino Facility (LBNF) will send neutrinos to the underground DUNE experiment 1300 km away in South Dakota, prototype detectors for which are currently being built at CERN. Meanwhile, Fermilab’s Short-Baseline Neutrino programme has just taken delivery of the 760 tonne cryostat for its ICARUS experiment, recently refurbished at CERN, while a major experiment called Muon g-2 is about to record its first data. This suite of experiments, carried out in co-operation with CERN and other international labs, puts Fermilab at the leading edge of the intensity frontier and continues Wilson’s dream of exploration and discovery.

Search for sterile neutrinos triples up

This summer, two 270 m3 steel containment vessels are making their way by land, sea and river from CERN in Europe to Fermilab in the US, a journey that will take five weeks. Each vessel houses one of the 27,000-channel precision wire chambers of the ICARUS detector, which uses advanced liquid-argon technology to detect neutrinos. Having already operated successfully in the CERN to Gran Sasso neutrino beam from 2010 to 2012, and spent the past two years being refurbished at CERN, ICARUS will team up with two similar detectors at Fermilab to deliver a new physics opportunity: the ability to resolve some intriguing experimental anomalies in neutrino physics and perform the most sensitive search to date for eV-scale sterile neutrinos. This new endeavour, comprising three large liquid-argon detectors (SBND, MicroBooNE and ICARUS) sitting in a single intense neutrino beam at Fermilab, is known as the Short-Baseline Neutrino (SBN) programme.

The sterile neutrino is a hypothetical particle, originally introduced by Bruno Pontecorvo in 1967, which doesn’t experience any of the known forces of the Standard Model. Sterile-neutrino states, if they exist, are not directly observable since they don’t interact with ordinary matter, but the phenomenon of neutrino oscillations provides us with a powerful probe of physics beyond the Standard Model. Active–sterile mixing, just like standard three-neutrino mixing, could generate additional oscillations among the standard neutrino flavours but at wavelengths that are distinct from the now well-measured “solar” and “atmospheric” oscillation effects. Anomalies exist in the data of past neutrino experiments that present intriguing hints of possible new physics. We now require precise follow-up experiments to either confirm or rule out the existence of additional, sterile-neutrino states.    

On the scent of sterile states

The discovery nearly two decades ago of neutrino-flavour oscillations led to the realisation that each of the familiar flavours (νe, νμ, ντ) is actually a linear superposition of states of distinct masses (ν₁, ν₂, ν₃). The wavelength of an oscillation is determined by the difference in the squared masses of the participating mass states, Δm²ij = mi² − mj². The discoveries that were awarded the 2015 Nobel Prize in Physics correspond to the atmospheric mass-splitting Δm²ATM = |m₃² − m₂²| = 2.5 × 10⁻³ eV² and the solar mass-splitting Δm²SOLAR = m₂² − m₁² = 7.5 × 10⁻⁵ eV², so-named because of how they were first observed. Any additional, mostly sterile mass state could therefore generate a unique oscillation driven by a new mass scale in the neutrino sector: m²(mostly sterile) − m²(mostly active).
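In the simplest two-flavour approximation (a standard textbook simplification, not the full multi-state analysis used by the experiments), the probability for one flavour to appear in a beam of another can be written as

```latex
P(\nu_\mu \rightarrow \nu_e) \;=\; \sin^2 2\theta \,
\sin^2\!\left(\frac{1.27\,\Delta m^2\,[\mathrm{eV}^2]\;L\,[\mathrm{km}]}{E\,[\mathrm{GeV}]}\right),
```

so the mass-splitting Δm² sets the oscillation wavelength in L/E, while the mixing angle θ sets the amplitude of the effect.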

The most significant experimental hint of new physics comes from the LSND experiment performed at the Los Alamos National Laboratory in the 1990s, which observed a 3.8σ excess of electron antineutrinos appearing in a mostly muon-antineutrino beam in a region where standard mixing would predict no significant effect. Later, in the 2000s, the MiniBooNE experiment at Fermilab found excesses of both electron neutrinos and electron antineutrinos, although there is some tension with the original LSND observation. Other hints come from the apparent anomalous disappearance of electron antineutrinos over baselines of less than a few hundred metres at nuclear-power reactors (the “reactor anomaly”), and the lower than expected rate in radioactive-source calibration data from the gallium-based solar-neutrino experiments GALLEX and SAGE (the “gallium anomaly”). Numerous other searches in appearance and disappearance channels have been conducted at various neutrino experiments with null results (including ICARUS when it operated in the CERN to Gran Sasso beam), and these have thus constrained the parameter space where light sterile neutrinos could still be hiding. A global analysis of the available data now limits the possible sterile–active mass-splitting, m²(mostly sterile) − m²(mostly active), to a small region around 1–2 eV².

Long-baseline accelerator-based neutrino experiments such as NOvA at Fermilab, T2K in Japan, and the future Deep Underground Neutrino Experiment (DUNE) in the US, which will involve detectors located 1300 km from the source, are tuned to observe oscillations related to the atmospheric mass-splitting, Δm²ATM ~ 10⁻³ eV². Since the mass-squared difference between the participating states and the length scale of the oscillation they generate are inversely proportional to one another, a short-baseline accelerator experiment such as SBN, with detector distances of the order of 1 km, is most sensitive to an oscillation generated by a mass-squared difference of order 1 eV² – exactly the region we want to search.
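As a rough consistency check, these scales follow directly from the numbers above. The sketch below (a back-of-the-envelope calculation; the function name is ours) uses the standard two-flavour oscillation formula, in which the first oscillation maximum occurs where 1.27 Δm²L/E = π/2:

```python
import math

def first_osc_maximum_km(delta_m2_ev2, energy_gev):
    """Baseline (km) of the first oscillation maximum for a given
    mass-splitting (eV^2) and neutrino energy (GeV), from the standard
    two-flavour condition 1.27 * dm2 * L / E = pi/2."""
    return math.pi * energy_gev / (2 * 1.27 * delta_m2_ev2)

# Sterile-scale splitting (~1 eV^2) at the BNB peak energy (~0.7 GeV):
# the first maximum falls below 1 km -- hence a *short*-baseline experiment.
print(round(first_osc_maximum_km(1.0, 0.7), 2))   # ~0.87 km

# Atmospheric splitting (~2.5e-3 eV^2) at a few GeV: the maximum falls
# over 1000 km away -- hence *long*-baseline experiments like DUNE.
print(round(first_osc_maximum_km(2.5e-3, 2.5)))   # ~1237 km
```

The sub-kilometre answer for the eV²-scale splitting is exactly why the SBN detectors sit between 110 m and 600 m from the target.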

Three detectors, one beam

The SBN programme has been designed to definitively address this question of short-baseline neutrino oscillations and test the existence of light sterile neutrinos with unprecedented sensitivity. The key to SBN’s reach is the deployment of multiple high-precision neutrino detectors, all of the same technology, at different distances along a single high-intensity neutrino beam. Use of an accelerator-based neutrino source has the bonus that both electron-neutrino appearance and muon-neutrino disappearance oscillation channels can be investigated simultaneously.

The neutrino source is Fermilab’s Booster Neutrino Beam (BNB), which has been operating at high rates since 2002 and providing beam to multiple experiments. The BNB is generated by impinging 8 GeV protons from the Booster onto a beryllium target and magnetically focusing the resulting hadrons, which decay to produce a broad-energy neutrino beam peaked around 700 MeV that is made up of roughly 99.5% muon neutrinos and 0.5% electron neutrinos.

The three SBN detectors are each liquid-argon time projection chambers (LArTPCs) located along the BNB neutrino path (see images above). MicroBooNE, an 87 tonne active-mass LArTPC, is located 470 m from the neutrino production target and has been collecting data since October 2015. The Short-Baseline Near Detector (SBND), a 112 tonne active-mass LArTPC to be sited 110 m from the target, is currently under construction and will provide the high-statistics characterisation of the un-oscillated BNB neutrino fluxes that is needed to control systematic uncertainties in searches for oscillations at the downstream locations. Finally, ICARUS, with 476 tonnes of active mass and located 600 m from the BNB target, will achieve a sufficient event rate at the downstream location where a potential oscillation signal may be present. Many of the upgrades to ICARUS implemented during its time at CERN over the past few years are in response to unique challenges presented by operating a LArTPC detector near the surface, as opposed to the underground Gran Sasso laboratory where it operated previously. The SBN programme is being realised by a large international collaboration of researchers with major detector contributions from CERN, the Italian INFN, Swiss NSF, UK STFC, and US DOE and NSF. At Fermilab, new experimental halls to house the ICARUS and SBND detectors were constructed in 2016 and are now awaiting the LArTPCs. ICARUS and SBND are expected to begin operation in 2018 and 2019, respectively, with approximately three years of ICARUS data needed to reach the programme’s design sensitivity.

A rich physics programme

In a combined analysis, the three SBN detectors allow for the cancellation of common systematics and can therefore test the νμ → νe oscillation hypothesis at a level of 5σ or better over the full range of parameter space originally allowed at 99% C.L. by the LSND data. Recent measurements, especially from the NEOS, IceCube and MINOS experiments, have constrained the possible sterile-neutrino parameters significantly, and the sensitivity of the SBN programme is highest near the most favoured values of Δm². In addition to νe appearance, SBN also has the sensitivity to νμ disappearance needed to confirm an oscillation interpretation of any observed appearance signal, thus providing a more robust result on sterile-neutrino-induced oscillations (figure 1).

SBN was conceived to unravel the physics of light sterile neutrinos, but the scientific reach of the programme is broader than just the searches for short-baseline neutrino oscillations. The SBN detectors will record millions of neutrino interactions that can be used to make precise measurements of neutrino–argon interaction cross-sections and perform detailed studies of the rather complicated physics involved when neutrinos scatter off a large nucleus such as argon. The SBND detector, for example, will see of the order 100,000 muon-neutrino interactions and 1000 electron-neutrino interactions per month. For comparison, existing muon-neutrino measurements of these interactions are based on only a few thousand total events and there are no measurements at all with electron neutrinos. The position of the ICARUS detector also allows it to see interactions from two neutrino beams running concurrently at Fermilab (the Booster and Main Injector neutrino beams), allowing for a large-statistics measurement of muon and electron neutrinos in a higher-energy regime that is important for future experiments.

In fact, the science programme of SBN has several important connections to the future long-baseline neutrino experiment at Fermilab, DUNE. DUNE will deploy multiple 10 kt LArTPCs 1.5 km underground in South Dakota, 1300 km from Fermilab. The three detectors of SBN present an R&D platform for advancing this exciting technology and are providing direct experimental activity for the global DUNE community. In addition, the challenging multi-detector oscillation analyses at SBN will be an excellent proving ground for sophisticated event reconstruction and data-analysis techniques designed to maximally exploit the excellent tracking and calorimetric capabilities of the LArTPC. From the physics point of view, discovering or excluding sterile neutrinos plays an important role in the ability of DUNE to untangle the effects of charge-parity violation in neutrino oscillations, a primary physics goal of the experiment. Also, precise studies of neutrino–argon cross-sections at SBN will help control one of the largest sources of systematic uncertainties facing long-baseline oscillation measurements.    

Closing in on a resolution

The hunt for light sterile neutrinos has continued for several decades now, and global analyses are regularly updated with new results. The original LSND data still contain the most significant signal, but the resolution on Δm² was poor and so the range of values allowed at 99% C.L. spans more than three orders of magnitude. Today, only a small region of mass-squared values remains compatible with all of the available data, and a new generation of improved experiments, including the SBN programme, is under way or has been proposed that can rule on sterile-neutrino oscillations in exactly this region.

There is currently a lot of activity in the sterile-neutrino area. The nuPRISM and JSNS² proposals in Japan could also test for νμ → νe appearance, while new proposals like the KPipe experiment, also in Japan, could contribute to the search for νμ disappearance. The MINOS+ and IceCube detectors, both of which have already set strong limits on νμ disappearance, still have additional data to analyse. A suite of experiments is already under way (NEOS, DANSS, Neutrino-4) or in the planning stages (PROSPECT, SoLid, STEREO) to test for electron-antineutrino disappearance over short baselines at reactors, and others are being planned that will use powerful radioactive sources (CeSOX, BEST). These electron-neutrino and -antineutrino disappearance searches are highly complementary to the search modes being explored at SBN.

The Fermilab SBN programme offers world-leading sensitivity to oscillations in two different search modes at the most relevant mass-splitting scale as indicated by previous data. We will soon have critical new information regarding the possible existence of eV-scale sterile neutrinos, resulting in either one of the most exciting discoveries across particle physics in recent years or the welcome resolution of a long-standing unresolved puzzle in neutrino physics.

LArTPCs rule the neutrino-oscillation waves
A schematic diagram of the ICARUS liquid-argon time projection chamber (LArTPC) detector, where electrons create signals on three rotated wire planes. The concept of the LArTPC for neutrino detection was first conceived by Carlo Rubbia in 1977, followed by many years of pioneering R&D activity and the successful operation of the ICARUS detector in the CNGS beam from 2010 to 2012, which demonstrated the effectiveness of single-phase LArTPC technology for neutrino physics. A LArTPC provides both precise calorimetric sampling and 3D tracking similar to the extraordinary imaging features of a bubble chamber, and is also fully electronic and therefore potentially scalable to large, several-kilotonne masses. Charged particles propagating in the liquid argon ionise argon atoms, and the freed electrons drift under the influence of a strong, uniform electric field applied across the detector volume. The drifting ionisation electrons induce signals on, or are collected by, planes of closely spaced sense wires located on one side of the detector boundary, with the wire signals proportional to the amount of energy deposited in a small cell. The low electron drift speed, around 1.6 mm/μs, requires a continuous read-out time of 1–2 milliseconds for a detector a few metres across. This creates a challenge when operating these detectors at the surface, as the SBN detectors will be at Fermilab, so photon-detection systems will be used to collect fast scintillation light and time each event.
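The millisecond-scale read-out window quoted above follows directly from the drift speed. A minimal sketch of the arithmetic (the function name is ours):

```python
def readout_time_ms(drift_length_m, drift_speed_mm_per_us=1.6):
    """Time (ms) for ionisation electrons to drift across the full
    detector at the quoted speed of ~1.6 mm/us."""
    time_us = (drift_length_m * 1000.0) / drift_speed_mm_per_us
    return time_us / 1000.0  # microseconds -> milliseconds

# A detector ~3 m across needs a continuous ~1.9 ms read-out window,
# consistent with the 1-2 ms figure quoted in the text.
print(round(readout_time_ms(3.0), 2))  # 1.88
```

During such a long window a surface detector accumulates many cosmic-ray tracks, which is why the fast scintillation light is needed to timestamp each event.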

A selection of Fermilab’s greatest hits

Revisiting the b revolution

Scientists summoned from all parts of Fermilab had gathered in the auditorium on the afternoon of 30 June 1977. Murmurs of speculation ran through the crowd about the reason for the hastily scheduled colloquium. In fact, word of a discovery had begun to leak out [long before the age of blogs], but no one had yet made an official announcement. Then Steve Herb, a postdoc from Columbia University, stepped to the microphone and ended the speculation: scientists at Fermilab Experiment 288 had discovered the upsilon particle. A new generation of quarks was born. The upsilon had made its first, famous appearance at the Proton Center at Fermilab. The particle, a b quark and an anti-b quark bound together, constituted Fermilab’s first major discovery. Leon Lederman, spokesman for the original experiment, described the upsilon discovery as “one of the most expected surprises in particle physics”.

The story had begun in 1970, when the Standard Model of particle interactions was a much thinner version of its later form. Four leptons had been discovered, while only three quarks had been observed – up, down and strange. The charm quark had been predicted, but was yet to be discovered, and the top and bottom quarks were not much more than a jotting on a theorist’s bedside table.

In June of that year, Lederman and a group of scientists proposed an experiment at Fermilab (then the National Accelerator Laboratory) to measure lepton production in a series of experimental phases that began with the study of single leptons emitted in proton collisions. This experiment, E70, laid the groundwork for what would become the collaboration that discovered the upsilon.


The original E70 detector design included a two-arm spectrometer [for the detection of lepton pairs, or di-leptons], but the group first experimented with a single arm [searching for single leptons that could come, for example, from the decay of the W, which was still to be discovered]. E70 began running in March of 1973, pursuing direct lepton production. Fermilab director Robert Wilson asked for an update from the experiment, so the collaborators extended their ambitions, planned for the addition of the second spectrometer arm and submitted a new proposal, number 288, in February 1974 – a single-page, six-point paper in which the group promised to get results, “publish these and become famous”. This two-arm experiment would be called E288.

The charm dimension

Meanwhile, experiments at Brookhaven National Laboratory and at the Stanford Linear Accelerator Center were searching for the charm quark. These two experiments led to what is known as the “November Revolution” in physics. In November of 1974, both groups announced they had found a new particle, which was later proven to be a bound state of the charm quark: the J/psi particle.

Some semblance of symmetry had returned to the Standard Model with the discovery of charm. But in 1975, an experiment at SLAC revealed the existence of a new lepton, called tau. This brought a third generation of matter to the Standard Model, and was a solid indication that there were more third-generation particles to be found.

The Fermilab experiment E288 continued the work of E70, so much of the hardware was already in place, awaiting upgrades. By the summer of 1975, the collaborators had completed construction of the detector. Lederman invited a group from the State University of New York at Stony Brook to join the project, which began taking data in the autumn of 1975.

One of the many legends in the saga of the b quark concerns a false peak in E288’s data. During data-taking, several events at an invariant mass between 5.8 and 6.2 GeV were observed, suggesting the existence of a new particle, for which the name upsilon was proposed. Unfortunately, the signal at that particular mass turned out to be a mere fluctuation, and the eagerly anticipated particle became known as the “Oops-Leon”.


What happened next is perhaps best described in a 1977 issue of The Village Crier (FermiNews’s predecessor): “After what Dr R R Wilson jocularly refers to as ‘horsing around,’ the group tightened its goals in the spring of 1977.” The tightening of goals came with a more specific proposal for E288 and a revamping of the detector. The collaborators, honed by their experiences with the Fermilab beam, used the detectors and electronics from E70 and the early days of E288, and added two steel magnets and two wire-chamber detectors borrowed from the Brookhaven J/psi experiment.

The simultaneous detection of two muons from upsilon decay characterised the particle’s expected signature. To improve the experiment’s muon-detection capability, collaborators called for the addition to their detector of 12 cubic feet – about two metric tonnes – of beryllium, a light element that would act as an absorber for particles such as protons and pions, but would have little effect on the sought-for muons. When the collaborators had problems finding enough of the scarce and expensive material, an almost forgotten supply of beryllium in a warehouse at Oak Ridge National Laboratory came to the rescue. By April 1977, construction was complete.

Six weeks to fame

The experiment began taking data on 15 May 1977, and saw quick results. After one week of taking data, a “bump” appeared at 9.5 GeV. John Yoh, sure but not overconfident, put a bottle of champagne labelled “9.5” in the Proton Center’s refrigerator.

But champagne corks did not fly right away. On 21 May, fire broke out in a device that measures the current in a magnet, and spread to the wiring. The electrical fire created chlorine gas which, when the fire was doused with water, formed acid. The acid began to eat away at the electronics, threatening the future of E288. At 2.00 a.m., Lederman was on the phone searching for a salvage expert. He found one: a Dutchman who lived in Spain and worked for a German company. The expert agreed to come, but needed 10 days to get a US visa. Lederman called the US embassy, asking for an exception. Not possible, said the embassy official. Just as it began to look hopeless, Lederman mentioned that he was a Columbia University professor. The official turned out to be a Columbia graduate, class of 1956. The salvage expert was at Fermilab two days later. Collaborators used his “secret formulas” to treat some 900 electronic circuit boards, and E288 was back online by 27 May.

By 15 June, the collaborators had collected enough data to establish the bump at 9.5 GeV – evidence for a new particle, the upsilon. On 30 June, Steve Herb gave the official announcement of the discovery at a seminar at Fermilab, and on 1 July the collaborators submitted a paper to Physical Review Letters. It was published without review on 1 August.

Since the discovery of the upsilon, physicists have found several levels of upsilon states. Not only was the upsilon the first major discovery for Fermilab, it was also the first indication of a third generation of quarks. A bottom quark meant there ought to be a top quark. Sure enough, Fermilab found the top quark in 1995.

Bumps on the particle-physics road

The story of “bumps” in particle physics dates back to an experiment at the Chicago Cyclotron in 1952, when Herbert Anderson, Enrico Fermi and colleagues found that the π−p cross-section rose rapidly at pion energies of 80–150 MeV, with the effect about three times larger in π+p than in π−p. This observation had all the hallmarks of the resonance phenomena well known in nuclear physics, and could be explained by a state with spin 3/2 and isospin 3/2. With higher energies available at the Carnegie synchrocyclotron, in 1954 Julius Ashkin and colleagues were able to report that the π−p cross-section fell above about 180 MeV, revealing a characteristic resonance peak. Through the uncertainty principle, the peak’s width of some 100 MeV is consistent with a lifetime of around 10⁻²³ s. Further studies confirmed the resonance, later called the Δ, with a mass of 1232 MeV in four charge states: Δ++, Δ+, Δ0 and Δ−.
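The width-to-lifetime conversion quoted above is just the uncertainty principle, τ ≈ ħ/Γ. A minimal numerical check, using the 100 MeV width from the text and the standard value of ħ expressed in MeV·s:

```python
# Estimate a resonance lifetime from its width via tau ≈ hbar / Gamma.
HBAR_MEV_S = 6.582e-22   # reduced Planck constant in MeV*s (CODATA value)
WIDTH_MEV = 100.0        # Delta resonance width quoted in the text

lifetime_s = HBAR_MEV_S / WIDTH_MEV
print(f"lifetime ≈ {lifetime_s:.1e} s")  # ≈ 6.6e-24 s, i.e. of order 1e-23 s
```

The result, a few times 10⁻²⁴ s, is indeed of order 10⁻²³ s as stated, and is characteristic of a strongly decaying state.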

The Δ remained an isolated case until 1960, when a team led by Luis Alvarez began studies of K−p interactions using the 15 inch hydrogen bubble chamber at the Berkeley Bevatron. Graduate students Stan Wojcicki and Bill Graziano studied plots of the invariant mass of pairs of the particles produced, and found bumps corresponding to three resonances now known as the Σ(1385), the Λ(1405) and the K*(892). These discoveries opened a golden age for bubble chambers, and set in motion the industry of “bump hunting” and the field of hadron spectroscopy. Four years later, the Δ, together with Σ and Ξ resonances, figured in the famous decuplet of spin-3/2 particles in Murray Gell-Mann’s quark model. These resonances and others could now be understood as excited states of the constituents – quarks – of more familiar, longer-lived particles.

By the early 1970s, the number of broad resonances had grown into the hundreds. Then came the shock of the “November Revolution” of 1974. Teams at Brookhaven and SLAC discovered a new, much narrower resonance in experiments studying, respectively, p Be → e+e− X and e+e− annihilation. This was the famous J/psi, which after the dust had settled was recognised as the bound state of a predicted fourth quark, charm, and its antiquark. The discovery of the upsilon, again as a narrow resonance formed from a bottom quark and antiquark, followed three years later (see main article). By the end of the decade, bumps in appropriate channels were revealing a new spectroscopy of charm and bottom particles at energies around 4 GeV and 10 GeV, respectively.

This left the predicted top quark and, in the absence of any clear idea of its mass, over the following years searches at increasingly high energies looked for a bump that could indicate its quark–antiquark bound state. The effort moved from e+e− colliders to the higher energies of proton–antiproton machines, and it was the experimental groups at Fermilab’s Tevatron that eventually claimed the first observation of top quarks in 1995 – not in a resonance, but through their individual decays.

Important bumps did, however, appear in proton–antiproton collisions, this time at CERN’s SPS, in the experiments that discovered the W and Z bosons in 1983. The bumps allowed the first precise measurements of the masses of the bosons. The Z later became famous as an e+e− resonance, in particular at CERN’s LEP collider. The most precisely measured resonance yet, the Z has a mass of 91.1876±0.0021 GeV and a width of 2.4952±0.0023 GeV.

However, a more recent bump is probably still more famous – the Higgs boson as observed in 2012. In data from the ATLAS and CMS experiments, small bumps around 125 GeV in the mass spectrum in the four-lepton and two-photon channels, respectively, revealed the long-sought scalar boson (CERN Courier September 2012 p43 and p49).

Today, bump-hunting continues at machines spanning a huge range in energy, from China’s BEPC-II e+e− collider, with a beam energy of 1–2.3 GeV, to CERN’s LHC, operating at 6.5 TeV per beam. Recently, LHC experiments spotted a modest excess of events at an energy of 750 GeV; although the researchers cautioned that it was not statistically significant, it still prompted hundreds of publications on the arXiv preprint server. Alas, on this occasion as on others over the decades, the bump faded away once larger data sets were recorded.

Nevertheless, with the continuing searches for new high-mass particles, now as messengers for physics beyond the Standard Model, and searches at lower energies providing many intriguing bumps, who knows where the next exciting “bump” might appear?

• Christine Sutton, formerly CERN.

 

Baksan scales new neutrino heights

On 29 June 1967, the Soviet government issued a document that gave the go-ahead to build a brand new underground facility for neutrino physics in the Baksan valley in the mountainous region of the Northern Caucasus. Construction work began straight away on the tunnels under the 4000 m-high peak of Mount Andyrchi that would contain the experimental halls, and 10 years later, the laboratory’s first neutrino telescope started operation. Today, a varied experimental programme continues at the Baksan Neutrino Observatory, which is operated by the Institute for Nuclear Research (INR) of the Russian Academy of Sciences (RAS). And there is the promise of more to come.

The detailed proposal for the Baksan Neutrino Observatory was put together by “the father of neutrino astronomy”, Moisei Markov, and his younger colleagues, Alexander Chudakov, George Zatsepin and Alexander Pomansky, together with many others. The decision to construct a dedicated underground facility rather than use an existing mine – something that had never been done before – gave the scientists the freedom to choose the location and the structure of their future laboratory to maximise its scientific output. Their proposal to house it in an almost horizontal tunnel under a steep mountain decreased the construction costs by a factor of six with respect to a mine, while maintaining higher safety standards. They selected Andyrchi – one of a series of peaks dominated by Europe’s highest mountain, Mount Elbrus (5642 m) – from many potential sites. The entrance to the laboratory tunnel is located at an altitude of 1700 m in the valley below the peaks, an area well known to mountaineers, hikers and skiers. A small village called Neutrino was built to accommodate scientists and engineers working at the observatory, with office and laboratory buildings, surface installations, living quarters and related infrastructure.

The basic idea of underground neutrino detection is to use soil and rock to shield the installations from muons produced in cosmic-ray interactions with the atmosphere – the main background for neutrino detection. The underground complex at Baksan contains two interconnected tunnels (having two is a safety requirement), with laboratory halls situated at various distances along them, corresponding to different shielding conditions below the mountain. At the end of the 4 km-long tunnels, the muon flux is suppressed by a factor of almost 10 million with respect to the surface.

Experiments past

The first experiment to start at Baksan, back in 1973, was not, however, underground. The Carpet air-shower experiment completely covered an area of around 200 m² with 400 liquid-scintillator detectors, identical to those of the first neutrino telescope, BUST (see below). Its key task was a detailed study of the central part of air showers produced by cosmic particles in the atmosphere. One of its first results, based on the interpretation of shower sub-cores as imprints of high-transverse-momentum jets born in the primary interactions of the cosmic rays, was a measurement of the production cross-section of these jets for leading-particle energies up to 500 GeV. This result was published before the corresponding measurement at CERN’s Super Proton Synchrotron and confirmed predictions of quantum chromodynamics. Carpet’s observations of astrophysical importance included a puzzling giant flare of the Crab Nebula in 1989.

The Baksan Underground Scintillator Telescope (BUST) started operation in 1977. A multipurpose detector, it is located in an artificial cavern with a volume of 12,000 m³, 550 m from the tunnel entrance. The telescope is a four-level underground building 11.1 m high with a base area of 280 m². The building, made of low-radioactivity concrete, houses 3180 detectors containing 330 tonnes of liquid scintillator. Sensitive to cosmic neutrinos with energies of tens of MeV, the detector is well suited to the search for supernova neutrinos, and on 23 February 1987 it was one of four detectors in the world to register the renowned neutrino signal from supernova 1987A in the Large Magellanic Cloud. The results obtained with the telescope have been used for cosmic-ray studies, searches for exotic particles (notably magnetic monopoles) and searches for neutrino bursts.

Neutrinos with lower energies were the target of the Gallium–Germanium Neutrino Telescope, a pioneering device to search for solar neutrinos in the SAGE (Soviet–American Gallium Experiment) project. The first experiments to detect neutrinos from the Sun – Homestake in the US and Kamiokande II in Japan – registered neutrinos with energies of a few MeV, which are mainly produced in the decay of boron-8 and constitute less than 1% of the total solar-neutrino flux. These Nobel-prize-winning experiments revealed the solar-neutrino deficit, subsequently interpreted in terms of neutrino oscillations, the only firm laboratory indication so far for the incompleteness of the Standard Model of particle physics. However, to assess the problem fully, it was necessary to find out what happens with the bulk (86% of the total flux) of the solar neutrinos, which come from proton–proton (pp) fusion reactions and have energies below about 0.4 MeV.

In 1965, Vadim Kuzmin proposed using the reaction ⁷¹Ga + νe → ⁷¹Ge + e− to detect the low-energy solar neutrinos. This idea was implemented in two experiments: GALLEX in the Gran Sasso National Laboratory and SAGE at Baksan. SAGE, which has been in operation since 1986 and is led by Vladimir Gavrin, is located 3.5 km from the tunnel entrance, where the cosmic-ray muon flux is suppressed by a factor of several million. About 50 tonnes of liquid gallium are used as a target; amazingly, a special factory was built to produce this amount of gallium, which exceeded the total consumed by the Soviet Union at the time. A unique chemical technology was developed to allow about 15 germanium atoms to be extracted from the 50 tonnes of gallium every month.

SAGE and GALLEX were the first experiments to detect solar pp neutrinos and to confirm the solar-neutrino deficit for the bulk of the flux. Combined with results from other experiments to subtract sub-leading contributions from other channels, SAGE found the solar pp neutrino flux to be (6.0±0.8) × 10¹⁰ cm⁻² s⁻¹. This agrees nicely with the solar-model prediction (taking neutrino oscillations into account) of (5.98±0.04) × 10¹⁰ cm⁻² s⁻¹, and the result has been confirmed by the 2014 measurement by Borexino, using a different method, which gives (6.6±0.7) × 10¹⁰ cm⁻² s⁻¹.
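How good is the stated agreement? A quick sketch, using only the numbers quoted above and combining the uncertainties in quadrature under the usual assumption of independence:

```python
import math

# pp neutrino fluxes in units of 1e10 cm^-2 s^-1, taken from the text.
sage_flux, sage_err = 6.0, 0.8     # SAGE combined measurement
model_flux, model_err = 5.98, 0.04  # solar-model prediction

diff = abs(sage_flux - model_flux)
combined_err = math.hypot(sage_err, model_err)  # quadrature sum

print(f"difference = {diff / combined_err:.2f} combined sigma")
```

The measurement and prediction differ by only a few hundredths of the combined uncertainty, i.e. the agreement is essentially perfect and dominated by the experimental error.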

The unique underground conditions at Baksan also allowed the creation of several ultra-low-background laboratories where, in addition to the natural shielding, materials with extremely low radioactivity were used in construction. There are three shielded chambers at different depths where rare nuclear processes have been searched for and a number of low-background experiments performed, including a precise measurement of the isotopic composition of the lunar soil delivered by the Luna-16, Luna-20 and Luna-24 spacecraft.

Current experiments

Now 50, the Baksan Neutrino Observatory continues to probe the neutrino frontier. The scintillator telescope is still monitoring the universe for neutrino bursts, its almost 40-year exposure setting stringent constraints on the rate of core-collapse supernovae in the Milky Way. The non-observation of neutrinos associated with the gravitational-wave event of 14 September 2015, detected by the LIGO Observatory, puts a unique constraint on the associated flux of neutrinos with energies of 1–100 GeV, complementary to constraints from larger experiments at different energies.

Calibration of the gallium solar-neutrino experiments SAGE and GALLEX with artificial neutrino sources has revealed the so-called gallium anomaly, which can be understood in terms of a new, sterile-neutrino state. A new experiment, the Baksan Experiment on Sterile Transitions (BEST), has been initiated to check the anomaly and thus test the sterile-neutrino hypothesis. It will be based on a ⁵¹Cr artificial neutrino source with an intensity of around 100 PBq, placed at the centre of a spherical gallium target divided into two concentric zones with equal neutrino mean free paths; any significant difference in the rate of neutrino capture between the inner and outer zones would indicate the existence of a sterile neutrino. CrSOX, a similar experiment with the Borexino detector at Gran Sasso, might become competitive with BEST, but only in its full-scale configuration with a 400 PBq neutrino source. Reactor experiments would provide complementary information about sterile antineutrinos.

BEST is now fully constructed and is awaiting the artificial neutrino source. Meanwhile, ultra-pure gallium is still used in the SAGE experiment, confirming the stability of the solar-neutrino flux over decades: fortunately, the Sun is not about to change its power output.

Numerous experiments are being carried out in the low-background laboratory, thanks to a new experimental hall – Low Background Lab 3 on the figure above – located 3.67 km from the tunnel entrance (providing shielding equivalent to 4900 m of water). One of them searches for solar axions via their resonant reconversion on ⁸³Kr, and has already produced the world’s best constraint on certain couplings of the hadronic axion.

Among the surface-based experiments, the Carpet air-shower array is undergoing the most intense development. Equipped with a brand new muon detector with an area of 410 m², this old cosmic-ray installation is starting a new life as a sophisticated sub-PeV gamma-ray telescope. A world-best sensitivity to the diffuse gamma-ray flux above 100 TeV, which could be achieved by the end of 2017, would be sufficient to decide between a galactic and an extragalactic origin for the high-energy astrophysical neutrinos detected by the IceCube neutrino observatory at the South Pole.

Other experiments are also ready to produce interesting results. The Andyrchi air-shower array located on the slope of the mountain above BUST works in coincidence with the telescope, which serves as a muon detector with a 120 GeV threshold. A small gravitational-wave detector, OGRAN, capable of registering a galactic supernova, makes Baksan a true multi-messenger observatory. In addition, important interdisciplinary studies are taking place at the border with geophysics. They include not only deep-underground precise monitoring of seismic and magnetic parameters close to the sleeping volcano Elbrus, but also, for example, studies of atmospheric electricity and its relation to the cosmic-ray muon flux.

Future prospects

Looking ahead, the Baksan Neutrino Observatory could host new breakthrough experiments. The many planned projects include a further upgrade of the Carpet array with the increase in both the surface-array and muon-detector areas for the purposes of sub-PeV gamma-ray astronomy; a new resonant-reconversion solar axion experiment with a sensitivity an order of magnitude better than the present one; and a circular laser interferometer – or Sagnac gyroscope – for geophysics and fundamental-physics measurements.

However, the main project for the observatory is the Baksan Large-Volume Scintillator Detector (BLVSD, although the name of the experiment is yet to be fixed). This detector, currently at the R&D stage, should contain 10–20 kilotonnes of ultra-pure liquid scintillator and could be located at the end of the observatory tunnel. There, unused artificial caverns exist in which a Cl–Ar solar-neutrino experiment was originally planned, but was replaced by the SAGE Ga–Ge detector in a different cave. This large-volume detector should be able to detect not only neutrinos from a galactic core-collapse supernova, but also the composite neutrino background of numerous distant explosions, thus making it possible to study supernova neutrinos in the unlucky, but probable, case that no galactic explosion happens in the coming decades. In the opposite case, the large neutrino statistics from a nearby explosion would open up possibilities for a detailed study.

For solar neutrinos, BLVSD would be capable of measuring the neutrino flux from the carbon–nitrogen–oxygen (CNO) fusion cycle in the Sun with a precision sufficient to discriminate between various solar models and therefore solve experimentally the present-day contradiction between results from helioseismology and those from chemical-composition studies of the solar surface. A primary target for BLVSD would be the study of geoneutrinos, which are produced in nuclear decays in the Earth’s interior. Clearly, the detector could also be used for a precise study of neutrino oscillations, in particular with a dedicated long-distance accelerator beam.

BLVSD, if built, would join a global network of large-scale neutrino detectors. Such joint operation would open up possibilities to solve many interesting problems: it would allow, for instance, the effects of the inhomogeneous structure of the Earth’s crust to be included in geoneutrino studies, or the direction of a supernova that is obscured and visible only in neutrinos to be determined. The unique conditions at the Baksan observatory would also make solo operation of BLVSD efficient. Not only do the existing infrastructure and experience allow for ultra-low-background experiments, but the geographical position in the Northern Caucasus guarantees a large distance from nuclear reactors. For geoneutrinos, estimates of the ratio of the signal counting rate to the background from artificial reactors give a value of around 5 for Baksan, compared with around 1.1 for Borexino and around 0.15 for KamLAND. It is this low background that would allow a precise measurement of the solar CNO flux, which is barely possible in any of the currently operating experiments.

The large-scale BLVSD project is still in its infancy, and numerous efforts in R&D, fundraising and construction are still to be made. The Baksan Neutrino Observatory is fully open for worldwide collaboration and co-operation, both in this and in other scientific projects. Happy birthday, Baksan.

A new era for particle physics

Driven by technology, scale, geopolitical reform, and even surprising discoveries, the field of particle physics is changing dramatically. This evolution is welcomed by some and decried by others, but ultimately is crucial for the survival of the field. The next 50 years of particle physics will be shaped not only by the number, character and partnerships of labs around the world and their ability to deliver exciting science, but also by their capacity to innovate.

Increasingly, the science we wish to pursue as a field has demanded large, even mega-facilities. CERN’s LHC is the largest, but the International Linear Collider (ILC) project and plans for a large circular collider in China are not far behind, and could become the next mega-science facilities. The main centres of particle physics around the world have changed significantly in the last several decades as a result of this trend. But a new picture is emerging, one in which co-ordination among, and participation in, these mega-efforts is required at an international level. A more co-ordinated global playing field is developing, but “in transition” best describes the situation at present.

Europe and North America have clear directions of travel. CERN has emerged as the leader of particle physics in Europe; the US has only Fermilab devoted entirely to particle physics. Other national labs in the US and Europe have diversified into other areas of science and some have found a niche for particle physics while also connecting strongly to the research programmes at Fermilab and CERN. Indeed, in the US a network of laboratories has emerged involving SLAC, Brookhaven, Argonne and Berkeley. Once quite separate, now these labs contribute to the LHC and to the neutrino programme based at Fermilab. Furthermore, each lab is becoming specialised in certain science and technology areas, while their collective talents are being developed and protected by the Department of Energy. As a result, CERN and Fermilab are stronger partners than ever before. Likewise, several European labs are now engaged or planning to engage in the ambitious neutrino programme being hosted by Fermilab, for which CERN continues to be the largest partner.

Will this trend continue, with laboratories worldwide functioning more like a network, or will competition slow down the inevitable end game? There are at least a couple of wild cards in the deck at the moment. Chief issues for the future are how aggressively and at what scale will China enter the field with its Circular Electron Positron Collider and Super Proton–Proton Collider projects, and whether Japan will move forward to build the ILC. The two projects have obvious science overlap and some differences, and if either project moves forward, Europe and the US will want to be involved and the impact could be large.

The global political environment is also fast evolving, and science is not immune from fiscal cuts or other changes taking place. While it is difficult to predict the future, fiscal austerity is here to stay for the near term. The consequences may be dramatic or could continue the trend of the last few decades, shrinking and consolidating our field. Rising above this trend will take focus, co-ordination and hard work. More importantly than ever, the worldwide community needs to demonstrate the value of basic science to funding stakeholders and the public, and to propose compelling reasons why the pursuit of particle physics deserves its share of funding.

Top-quality, world-leading science should be the primary theme, but it is not enough. The importance of social and economic impact will loom ever larger, and the usual rhetoric about it taking 40 years to reap the benefits from basic research – even if true – will no longer suffice. Innovation, more specifically the role of science in fuelling economic growth, is the favourite word of many governments around the world. Particle physics needs to be engaged in this discussion and contribute its talent. Is this a laboratory effort, a network of laboratories effort, or a global effort? For the moment, innovation is local, sometimes national, but our field is used to thinking even bigger. The opportunity to lead in globalisation is on our doorstep once again.

Theory of Quantum Transport at Nanoscale: An Introduction

By Dmitry A Ryndyk
Springer


This book provides an introduction to the theory of quantum transport at the nanoscale – a rapidly developing field that studies charge, spin and heat transport in nanostructures and nanostructured materials. The theoretical models and methods collected in the volume are widely used in nano-, molecular- and bio-electronics, as well as in spin-dependent electronics (spintronics).

The book begins by introducing the basic concepts of quantum transport, including the Landauer–Büttiker method; the matrix Green function formalism for coherent transport; tunnelling (transfer) Hamiltonian and master equation methods for tunnelling; Coulomb blockade; and vibrons and polarons.
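The Landauer–Büttiker method mentioned above expresses a conductor’s two-terminal conductance as the conductance quantum times the sum of the channel transmissions, G = (2e²/h) Σₙ Tₙ. A minimal sketch of the formula, with the transmission values chosen purely for illustration:

```python
# Landauer formula: G = (2 e^2 / h) * sum of channel transmissions.
E_CHARGE = 1.602176634e-19   # elementary charge, C (CODATA)
H_PLANCK = 6.62607015e-34    # Planck constant, J*s (CODATA)

G0 = 2 * E_CHARGE**2 / H_PLANCK   # conductance quantum, ~7.75e-5 S

def landauer_conductance(transmissions):
    """Two-terminal conductance in siemens for the given channel transmissions."""
    return G0 * sum(transmissions)

# Hypothetical example: one fully open channel plus one half-open channel.
g = landauer_conductance([1.0, 0.5])
print(f"G = {g:.3e} S")  # 1.5 * G0
```

Each fully transmitting channel contributes one conductance quantum, which is why conductance through clean point contacts appears in quantised steps of G0.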

In the second part of the book, the author gives a general introduction to the non-equilibrium Green function theory, describing first the approach based on the equation-of-motion technique, and then a more sophisticated one based on the Dyson–Keldysh diagrammatic technique. The book focuses in particular on the theoretical methods able to describe the non-equilibrium (at finite voltage) electron transport through interacting nanosystems, specifically the correlation effects due to electron–electron and electron–vibron interactions.

The book would be useful for both masters and PhD students and for researchers or professionals already working in the field of quantum transport theory and nanoscience.

Thermodynamics and Equations of State for Matter: From Ideal Gas to Quark–Gluon Plasma

By Vladimir Fortov
World Scientific


This monograph presents a comparative analysis of different thermodynamic models of the equation of state (EOS). The author aims to present in a unified way both the theoretical methods and experimental material relating to the field.

Particular attention is given to the description of extreme states reached at high pressure and temperature. As a substance advances along the scale of pressure and temperature, its composition, structure and properties undergo radical changes, from an ideal state of non-interacting neutral particles described by classical Boltzmann statistics to the exotic forms of baryon and quark–gluon matter.

Studying the EOS of matter under extreme conditions is important for the study of astrophysical objects at different stages of their evolution as well as in plasma, condensed-matter and nuclear physics. It is also of great interest for the physics of high-energy concentrations that are either already attained or can be reached in the near future under controlled terrestrial conditions.

Ultra-extreme astrophysical and nuclear-physical applications are also analysed. Here, the thermodynamics of matter is affected substantially by relativity, high-power gravitational and magnetic fields, thermal radiation, the transformation of nuclear particles, nucleon neutronisation, and quark deconfinement.

The book is intended for a wide range of specialists who study the EOS of matter and high-energy-density physics, as well as for senior students and postgraduates.

Big Data: Storage, Sharing, and Security

By Fei Hu (ed.)
CRC Press


Nowadays, enormous quantities of data in a variety of forms are generated rapidly in fields ranging from social networks to online shopping portals to physics laboratories. The field of “big data” involves all the tools and techniques that can store and analyse such data, whose volume, variety and speed of production are not manageable using traditional methods. As such, this new field requires us to face new challenges. These challenges and their possible solutions are the subject of this book of 17 chapters, which is clearly divided into two sections: data management and security.

Each chapter, written by different authors, describes the state-of-the-art for a specific issue that the reader may face when implementing a big-data solution. Far from being a step-by-step manual, the book treats its topics theoretically and describes their practical uses. Every subject is very well referenced, pointing to many publications for readers to explore in more depth.

Given the diversity of topics addressed, it is difficult to give a detailed opinion on each of them, but some deserve particular mention. One is the comparison between different communication protocols, presented in depth and accompanied by many graphs that help the reader to understand the behaviour of these protocols under different circumstances. However, the black-and-white print makes it difficult to differentiate between the lines in these graphs. Another topic that is nicely introduced is the SP (simplicity and power) system, which makes use of innovative solutions to aspects such as the variety of data when dealing with huge amounts. Even though the majority of the topics in the book are clearly linked to big data, some of them are related to broader computing topics such as deep-web crawling or malware detection in Android environments.

Security in big-data environments is widely covered in the second section of the book, spanning cryptography, accountability and cloud computing. As the authors point out, privacy and security are key: solutions are proposed to successfully implement a reliable, safe and private platform. When managing such amounts of data, privacy needs to be carefully treated since delicate information could be extracted. The topic is addressed in several chapters from different points of view, from looking at outsourced data to accountability and integrity. Special attention is also given to cloud environments, since they are not as controlled as those "in house". Cloud environments may require data to be securely transmitted, stored and analysed to avoid access by unauthorised sources. Proposed approaches to apply security include encryption, authorisation and authentication methods.

The book is a good introduction to many of the aspects that readers might face or want to improve in their big-data environment.

Challenges and Goals for Accelerators in the XXI Century

By Oliver Brüning and Stephen Myers (eds)
World Scientific

Also available at the CERN bookshop


This mighty 840-page book covers an impressive range of subjects divided into no fewer than 45 chapters. Owing to the expertise and international reputations of the authors of the individual chapters, few if any other books in this field have managed to summarise such a broad topic with such authority. While too numerous to list in the space provided, the full list of authors – a veritable "who's who" of the accelerator world – can be viewed at worldscientific.com/worldscibooks/10.1142/8635#t=toc.

The book opens with two chapters devoted to a captivating historical review of the Standard Model and a general introduction to accelerators, and closes with two special sections. The first of these is devoted to novel accelerator ideas: plasma accelerators, energy-recovery linacs, fixed-field alternating-gradient accelerators, and muon colliders. The last section describes European synchrotrons used for tumour therapy with carbon ions and covers, in particular, the Heidelberg Ion Therapy Centre designed by GSI and the CERN Proton Ion Medical Machine Study. The last chapter describes the transformation of the CERN LEIR synchrotron into an ion facility for radiobiological studies.

Concerning the main body of the book, 17 chapters look back over the past 100 years, beginning with a concise history of the first three lepton colliders: AdA in Frascati, VEP-1 in Novosibirsk and the Princeton–Stanford electron–electron collider. A leap in time then takes the reader to CERN's Large Electron–Positron collider (LEP), which is followed by a description of the Stanford Linear Collider. Unfortunately, this latter chapter is too short to do full justice to such an innovative approach to electron–positron collisions.

The next section is devoted to beginnings, starting from the time of the Brookhaven Cosmotron and Berkeley Bevatron. The origin of alternating-gradient synchrotrons is well covered through a description of the Brookhaven AGS and the CERN Proton Synchrotron. The first two hadron colliders at CERN – the Intersecting Storage Rings (ISR) and the Super Proton Synchrotron (SPS) proton–antiproton collider – are then discussed. The ISR’s breakthroughs were numerous, including the discovery of Schottky scans, the demonstration of stochastic cooling and absolute luminosity measurements by van der Meer scans. Even more remarkable was the harvest of the SPS proton–antiproton collider, culminating with the Nobel prize awarded to Carlo Rubbia and Simon van der Meer. The necessary Antiproton Accumulator and Collector are discussed in a separate chapter, which ends with an amusing recollection: “December 1982 saw the collider arriving at an integrated luminosity of 28 inverse nanobarns and Rubbia offering a ‘champagne-only’ party with 28 champagne bottles!” Antiproton production methods are covered in detail, including a description of the manoeuvres needed to manipulate antiproton bunches and of the production of cold antihydrogen atoms. This subject is continued in a later chapter dedicated to CERN’s new ELENA antiproton facility.

The Fermilab proton–antiproton collider started later than the SPS, but eventually led to the discovery of the top quark by the CDF and D0 collaborations. The Fermilab antiproton recycler and main ring are described, followed by a chapter dedicated to the Tevatron, which was the first superconducting collider. The first author remarks that, over the years, some 10¹⁶ antiprotons were accumulated at Fermilab, corresponding to about 17 nanograms and more than 90% of the world's total man-made quantity of nuclear antimatter. This section of the book concludes with a description of the lepton–proton collider HERA at DESY, the GSI heavy-ion facility, and the rare-isotope facility REX at ISOLDE. Space is also given to the accelerator that was never built, the US Superconducting Super Collider (SSC), of which "the hopeful birth and painful death" is recounted.

The following 25 chapters are devoted to accelerators for the 21st century, with the section on “Accelerators for high-energy physics” centred on the Large Hadron Collider (LHC). In the main article, magisterially written, it is recalled that the 27 km length of the LEP tunnel was chosen having already in mind the installation of a proton–proton collider, and the first LHC workshop was organised as early as 1984. The following chapters are dedicated to ion–ion collisions at the LHC and to the upgrades of the main ring and the injector. The high-energy version of the LHC and the design of a future 100 km-circumference collider (with both electron–positron and proton–proton collision modes) are also covered, as well as the proposed TeV electron–proton collider LHeC. The overall picture is unique, complete and well balanced.

Other chapters discuss frontier accelerators: super B-factories, the BNL Relativistic Heavy Ion Collider (RHIC) and its electron–ion extension, linear electron–positron colliders, electron–positron circular colliders for Higgs studies and the European Spallation Source. Special accelerators for nuclear physics, such as the High Intensity and Energy ISOLDE at CERN and the FAIR project at GSI, are also discussed. Unfortunately, the book does not deal with synchrotron light sources, free electron lasers and high-power proton drivers. However, the latter are discussed in connection with neutrino beams by covering the CERN Neutrinos to Gran Sasso project and neutrino factories.

The book is aimed at engineers and physicists who are already familiar with particle accelerators and may appreciate the technical choices and stories behind existing and future facilities. Many of its chapters could also be formative for young people thinking of joining one of the described projects. I am convinced that these readers will receive the book very positively.
