Quark–Gluon Plasma 5

By Xin-Nian Wang (ed.)
World Scientific

As the fifth volume in a series on quark–gluon plasma (QGP), this text provides an update on recent advances in theoretical and phenomenological studies of the QGP. Quark–gluon plasma (also informally called “quark soup”) is a state of matter in quantum chromodynamics (QCD) hypothesised to exist at extremely high temperatures and densities, in which the constituents of hadrons, i.e. quarks and gluons, are no longer confined and behave as quasi-free particles.

The book is a collection of articles written by major international experts in the field, with the aim of meeting the needs of both novices – thanks to its pedagogical and comprehensive approach – and experienced researchers.

A significant amount of space is given – of course – to the impressive progress in experimental and theoretical studies of new forms of matter in high-energy heavy-ion collisions at RHIC, as well as at the LHC. The strongly coupled quark–gluon plasma (sQGP) discovered at RHIC has attracted the attention of many researchers and defined the path for future studies in the field. At the same time, heavy-ion collisions at unprecedentedly high energies at the LHC have opened up new lines of research.

This updated and detailed overview of the QGP joins the previous four volumes in the series, which altogether present a comprehensive and essential review of the subject, both for beginners and experts.

Images of Time: Mind, Science, Reality

By George Jaroszkiewicz
Oxford University Press

For ages, sundials were used to measure time, with typical accuracies of the order of a few minutes. After Galileo discovered that the small oscillations of a pendulum are isochronous, Huygens built the first prototype of a pendulum clock, reaching the remarkable accuracy of a few seconds. Today, improved measurements of time and frequency are at the heart of precision tests of quantum electrodynamics (QED). The anomalous magnetic moment of the muon is measured with an accuracy of better than one part in a billion. The global-positioning system (GPS) and satellite communications, as well as other technological applications, are based (directly or indirectly) on accurate measurements of time.

There are some who argue that, while time is measured accurately, its nature is debatable in so far as it appears ubiquitously in physics (from the second law of thermodynamics to the early universe) but often with slightly different meanings. There are even some who claim that time is a mystery whose foundations are sociological, biological and psychological. This recent work by George Jaroszkiewicz suggests that different disciplines (or even different areas of physics) have elaborated diverse images of time over the years. The ambitious and erudite purpose of the book is to collect all of the imagery related to the conceptualisation of time, with particular attention to the physical sciences.

The book is neither a treatise on the philosophy of science nor a physics monograph. The author tries to find a balance between physical concepts and philosophical digressions, but this goal is not always achieved: various physical concepts are introduced by insisting on a mathematical apparatus that seems, at once, too detailed for the layman and too sketchy for the scholar. Through the book’s 27 chapters (supplemented by assorted mathematical appendices), the reader is led to reflect on the subjective, cultural, literary, objective, and even illusionary, images of time. Each chapter consists of various short subsections, but the guiding logic of the chapter is sometimes lost amid the many interesting details. The overall impression is that different branches of physics deal with multiple images of time. Because these conceptualisations are not always consistent, time is perceived by the reader (and partly presented by the author) as an enigmatic theme of speculation. A malicious reader might even infer that, after nearly five centuries of the Galilean method, physicists are dealing daily with something they do not quite understand.

This knowledgeable review of the different images of time is certainly valuable, but it fails to explain why improved measurements of time and frequency are correlated with the steady development of modern science in general and of physics in particular. The truth is that the physical sciences thrive on a blend of experiments, theories and enigmas: without mysteries driving our curiosity, we would not know why we should accurately measure, for instance, the anomalous magnetic moment of the muon. However, by only contemplating time as an enigma, we would probably still be stuck with sundials.

The LHC: Run 2 has restarted

At the end of March, the LHC opened its doors to allow particles to travel around the ring for the first time since the year-end technical stop began in December 2015. Progress was good, and the phase of recommissioning with beam could rapidly start. The LHC team worked with low-intensity beam for a few weeks to re-commission all systems and to check all aspects of beam-based operation, to ensure that the LHC was fully safe before declaring “stable beams” – the signal that the experiments could start taking data.

Before the protons could circulate again, the machine underwent the final phase of preparation – known as the machine checkout. During this phase, all of the LHC’s systems are put through their paces without beam. A key part of the process is driving the magnetic circuits, radiofrequency accelerating cavities, collimators, transverse dampers, etc, repeatedly through the nominal LHC cycle.

A full programme of beam instrumentation checks took place, to ensure that active elements were working and that the complex acquisition chain was functioning properly. Detailed checks were performed on the collimation systems.

The radiofrequency system was re-commissioned and the LHC beam-dump system was subjected to stringent operational checks. In parallel, a pilot beam extracted from the Super Proton Synchrotron (SPS) was sent down the two SPS–LHC transfer lines to the beam dumps situated just before the entrance to the LHC.

While the machine checkout was ongoing, the experiments were finishing their own last interventions before the closure of the caverns.

2015 saw the start of Run 2 for the LHC, during which the proton–proton collision energy reached 13 TeV. Beam intensity has increased, and by the end of the 2015 run, 2240 proton bunches per beam were being collided. This year, the aim is to increase the number of bunches even further, to the target of 2748. The goal is to reach an integrated luminosity of around 25 inverse femtobarns (fb–1), up from the 4 fb–1 reached by the end of last year. One fb–1 corresponds to around 80 million million collisions.
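The quoted conversion from integrated luminosity to a number of collisions is simple arithmetic: the number of events equals the cross-section times the integrated luminosity. A minimal sketch, assuming an inelastic proton–proton cross-section of roughly 80 mb at 13 TeV (an assumed value, not quoted in the text):

# Back-of-envelope check of "one fb^-1 corresponds to around 80 million million collisions",
# assuming sigma_inel(pp) ~ 80 mb at 13 TeV (assumption, not from the article).
MB_TO_FB = 1e12              # 1 millibarn = 10^12 femtobarn
sigma_inel_mb = 80.0         # assumed inelastic pp cross-section [mb]
int_lumi_fb = 1.0            # integrated luminosity [fb^-1]

n_collisions = sigma_inel_mb * MB_TO_FB * int_lumi_fb
print(f"~{n_collisions:.1e} collisions per fb^-1")   # -> ~8.0e+13, i.e. ~80 million million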

At the heart of every LHC collision

At the heart of every LHC collision are the constituents of protons: the quarks and gluons, collectively known as partons. These partons can undergo hard-scattering processes, producing a plethora of final states ranging from the massless to the very massive, such as W and Z bosons or top-quark pairs. Understanding these production cross-sections and their evolution as a function of the centre-of-mass energy, √s, of the LHC is an important component of understanding all of the measurements performed by ATLAS, including searches for new physics beyond the Standard Model.

Figure 1 illustrates some of the cross-section measurements made by ATLAS at √s = 7, 8 and 13 TeV. The new 13 TeV data collected in 2015 greatly extend the lever arm of the investigation of the √s evolution, with increased cross-sections for W and Z bosons and top-quark pairs by factors of approximately two and three, respectively, from their values at 8 TeV.

The final states observed from hard scattering tell a story of which partons participated in the collisions: e.g. top-quark production is related to the gluon content of the proton, Z-boson production provides insight into the quark sea, and W-boson production into the relationship between the valence quarks. These measurements are pieces of the proton puzzle, and because the √s evolution changes the range of parton momentum fractions probed by the collisions, the 13 TeV data open up a new kinematic region of investigation.

Via hard scattering, one can also test the predictions of perturbative QCD – a key component of the Standard Model. Single-boson and diboson production are currently predicted at next-to-next-to-leading order (NNLO), and top-quark pair production at NNLO plus next-to-next-to-leading-logarithmic (NNLL) accuracy. As √s increases, the mix of hard-scattering processes changes, and the precision measurements become increasingly dependent on knowledge of the growing electroweak corrections, currently available at NLO. With higher √s, rarer processes such as Z-boson pair production (ZZ) become more accessible and open an enticing window onto potential new physics.

As is evident from figure 1, the results match the Standard Model expectations well. Apart from a common beam-luminosity uncertainty, the measurements at 13 TeV have an experimental precision ranging from under 1% for Z bosons, to 3% for W bosons and top-quark pairs, to 14% for ZZ – the latter still being dominated by statistical uncertainties. However, measuring ratios of cross-sections can benefit from the cancellation of many experimental uncertainties. This is evident from the W+/W– cross-section ratio at 13 TeV, which has a total systematic uncertainty of less than 1%, rivalling the precision of current parton-distribution-function predictions, although its central value is consistently lower than predicted. Results such as those presented here will contribute significantly to the understanding of the large 13 TeV data set expected in the coming years.

ALICE finds a new source of charmonium

The ALICE collaboration has studied the production of charmonium – bound states of charm and anti-charm quarks – in hadronic as well as in ultra-peripheral collisions of lead nuclei at √sNN = 2.76 TeV. In the latter case, the nuclei do not overlap, and the charmonium is produced through a photonuclear interaction (CERN Courier November 2012 p9). Recently, however, ALICE has found a clear signal for what appears to be photoproduction of J/ψ mesons, the lowest-lying vector charmonium state, also in collisions with significant nuclear overlap.

The nuclear overlap of heavy-ion collisions can be classified based on centrality, which is expressed as a percentile between 0 and 100%, corresponding to head-on and grazing collisions, respectively, or expressed in terms of the impact parameter, which is the distance between the centres of the two colliding nuclei in a plane that is transverse to the beam axis (CERN Courier May 2013 p31). The signal is most clearly seen in the transverse-momentum (pT) distribution shown in figure 1. Hadronically produced J/ψ mesons have a mean pT around 2 GeV/c, and the spectrum shown in figure 1 is consistent with hadronic production down to a pT of about 0.3 GeV/c. Below this value, there is a very strong excess, which cannot be reproduced by any model assuming hadronic production, but which is consistent with the sum of the expected hadronic production plus a contribution from coherently photoproduced J/ψ. This last contribution is shown with the Monte Carlo template in the figure. The yield in this region of phase space is about a factor of seven above what is expected from a scaling of the hadronic yield with the number of binary nucleon–nucleon collisions. This unexpectedly large value implies that there is a physics process at play that has not been taken into account in currently available models. Assuming that the underlying process is photoproduction, ALICE obtained the corresponding cross-section.

While in hadronic collisions the nuclei break, they each act as one entity in coherent photoproduction, where the smallness of the pT is related, via the Heisenberg uncertainty principle, to the size of the lead nucleus. Interestingly, current models of coherent photoproduction integrated over the impact-parameter range corresponding to peripheral collisions predict cross-sections with the right magnitude.
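The link between coherence and small pT can be made quantitative with the uncertainty principle, pT ~ ħc/R. A rough illustration, assuming a lead-nucleus radius of about 7 fm (a number not quoted in the article):

# Uncertainty-principle estimate of the coherent-photoproduction pT scale.
# The lead-nucleus radius used here is an assumed, illustrative value.
HBARC_MEV_FM = 197.3    # hbar*c in MeV*fm
r_pb_fm = 7.0           # approximate lead-nucleus radius [fm] (assumed)

pt_mev = HBARC_MEV_FM / r_pb_fm
print(f"typical coherent pT ~ {pt_mev:.0f} MeV/c")
# -> ~30 MeV/c, far below the ~300 MeV/c scale under which the excess appears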

New Pb–Pb collision data at √sNN = 5.02 TeV recorded by ALICE in 2015 should allow us to quantify this excess with higher precision and to evaluate its strength in more central collisions. Whether this new source of very-low-pT J/ψ will provide an additional probe of the properties of the QGP remains an open question.

CMS updates its search for diphoton resonances

In December 2015, just a few weeks after the end of the initial LHC Run 2 period recording proton–proton collisions at the world-record collision energy of 13 TeV, CMS and ATLAS presented several new results based on these new data. These results were eagerly anticipated: at this centre-of-mass energy, new particles heavier than 1–2 TeV could be produced over 10 times more frequently than during Run 1.

The results presented by CMS were based on a data set corresponding to an integrated luminosity of ~2.7 fb–1. Because of the short time between the end of data-taking and the presentation of the results, only preliminary calibrations could be applied. However, these were not all of the data that CMS recorded: an additional 0.6 fb–1 were collected without a magnetic field (the 0 T data set). The cryogenic plant delivering the liquid helium needed to operate the superconducting solenoid was disrupted during 2015 by the presence of contaminants. The filters inside the cryogenic plant had to be regenerated several times, each time requiring the magnet to be ramped down. Before continuing the story of the 0 T data set, we want to reassure the reader that the system underwent an extensive programme of cleaning and maintenance during the end-of-year technical stop, and it is now on track for reliable operation in 2016.

The perfect candidate analysis for these data is the search for resonances in the diphoton final state. Preliminary results for this search, shown by CMS and ATLAS in December 2015, generated significant interest within the high-energy community because of a simultaneous excess of data with respect to the expected background seen by both experiments at a diphoton mass of about 750 GeV.

While the momenta of charged particles require a magnetic field to be measured, the energies of neutral and charged particles can be measured with the CMS electromagnetic and hadronic calorimeters even without the magnet. Therefore, although challenging, it is still possible to use data collected without a magnetic field through dedicated reconstruction and selection procedures. Photons are neutral particles, unaffected by the magnetic field, and their energies are measured with a precision better than 1.5% using the CMS lead-tungstate crystal electromagnetic calorimeter. For the 0 T data set, the energy scale and resolution of the electromagnetic calorimeter were carefully cross-checked and adjusted using electrons from Z-boson decays. The momentum information normally used for the vertex assignment and isolation criteria was replaced at 0 T by track counting, as was previously done by CMS in the summer of 2015 for the very first publication on the 13 TeV data, a study of hadron multiplicity without a magnetic field.

The inclusion of the 0 T data and the use of optimised calibrations improve the overall expected sensitivity for a narrow resonance at 750 GeV by about 20%. The new results still exhibit an excess at a mass around 750 GeV. The new local significance for a narrow resonance hypothesis is 2.8σ. When combined with the 8 TeV data set from Run 1, the largest excess is observed at 750 GeV with a local significance of 3.4σ, corrected to 1.6σ when accounting for the possibility of a signal appearing anywhere in the explored mass range. The analysis gives similar results for both spin-0 and spin-2 signal hypotheses.

Therefore, even after the final calibration and with slightly more data, an intriguing excess remains. Only additional data will tell us whether this is an early sign of new physics.

World’s most precise measurements and search for the X(5568) tetraquark candidate

LHCb

At the Rencontres de Moriond EW conference, held at La Thuile (Italy) from 12 to 19 March, the LHCb collaboration presented important new results.

CKM γ-angle measurements. The parameters that describe the difference in behaviour between matter and antimatter, known as CP violation, are constrained in the so-called CKM, or unitarity, triangle. The angles of this triangle are denoted α, β and γ, and among these, γ is the least precisely known. The γ value of (70.9 +7.1 −8.5)° presented at the conference was obtained from a combination of many different LHCb measurements, and is the most precise determination of γ from a single experiment. One of the new analyses presented at the conference uses decays of charged B mesons into charmed D mesons and pions or kaons. In turn, the D mesons decay into various combinations of pions and kaons. The results show different rates of positive and negative B mesons, clearly indicating different properties of matter and antimatter.

Determination of the B0 oscillation frequency. A fascinating quantum-mechanical feature of the B0s, B0 and D0 particles is that they turn into their antimatter partners, a phenomenon called oscillation or mixing. LHCb physicists analysed the full Run 1 data sample of semileptonic B0 decays with charged D or D* mesons, and presented the most precise single measurement of the parameter that sets the B0-meson oscillation frequency: Δmd = (505.0 ± 2.1 ± 1.0) ns–1.
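For readers who prefer natural units, an oscillation frequency translates into a mass difference between the two mass eigenstates simply by multiplying by ħ. A minimal sketch of that conversion (only the quoted frequency is taken from the measurement):

# Convert the B0 oscillation frequency into a mass splitting in eV:
# delta_m [eV] = hbar [eV*s] * frequency [s^-1]
HBAR_EV_S = 6.582e-16        # reduced Planck constant in eV*s
freq_per_ns = 505.0          # measured oscillation frequency [ns^-1]

delta_m_ev = HBAR_EV_S * freq_per_ns * 1e9   # ns^-1 -> s^-1
print(f"delta_m ~ {delta_m_ev:.2e} eV")      # -> ~3.3e-4 eV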

Non-confirmation of the X(5568) tetraquark candidate. Recently, the DZero collaboration at Fermilab reported the observation of a narrow structure, X(5568), in the invariant-mass spectrum of a B0s meson and a charged pion (CERN Courier April 2016 p13), and interpreted it as a tetraquark candidate composed of four different quark flavours (b, s, u and d).

At the Moriond conference, the LHCb collaboration reported the result of a similar analysis using a sample of B0s mesons 20 times larger than that used by the DZero collaboration. The B0sπ invariant-mass spectrum is shown in the figure, using B0s mesons decaying into J/ψ and φ mesons or into Ds and π mesons. No structure is seen in the region around the mass of 5568 MeV (indicated by the arrow). Hence, the LHCb analysis does not confirm the DZero result. Using kinematic requirements similar to those applied by the DZero collaboration in their analysis, the ratio of the X(5568) to the B0s-meson production rate is found to be less than 1.6%, at 90% confidence level.

CALET sees events in millions

Just a few months after its launch and the successful completion of the on-orbit commissioning phase aboard the International Space Station, the CALorimetric Electron Telescope (CALET) has started observations of high-energy charged particles and photons coming from space. To date, more than a hundred million events at energies above 10 GeV have been recorded and are under study.

CALET is a space mission led by JAXA with the participation of the Italian Space Agency (ASI) and NASA. CALET is also a CERN-recognised experiment; the collaboration used CERN’s beams to calibrate the instrument, which was launched from the Tanegashima Space Center on 19 August 2015, on board the Japanese H2-B rocket. After berthing with the ISS a few days later, CALET was robotically extracted from the transfer vehicle HTV5, operated by JAXA, and installed on the external platform JEM-EF of the Japanese module (KIBO). The check-out phase went smoothly, and after data calibration and verification, CALET moved to regular observation mode in mid-October 2015. The data-taking will continue for an initial period of two years, with a target of five years.

CALET is designed to study electrons, nuclei and γ-rays coming from space. In particular, one of its main goals is to perform precision measurements of the detailed shape of the electron spectrum above 1 TeV. High-energy electrons are expected to come from within a few thousand light-years of Earth, as they quickly lose energy while travelling through space. Their detection might reveal the presence of nearby astronomical sources where electrons are accelerated. The high end of the spectrum is particularly interesting because it could provide a clue to possible signatures of dark matter.

The first data sets are confirming that all of the instruments are working extremely well. The event image above (raw data) shows the detailed shape of the development of a shower of secondary particles generated by the impact of a candidate electron with an estimated energy greater than 1 TeV. The high-resolution energy measurement is provided by CALET’s deep, homogeneous calorimeter equipped with lead-tungstate (PbWO4) crystals preceded by a high-granularity (1 mm scintillating fibres) pre-shower calorimeter with advanced imaging capabilities. The depth of the instrument ensures good containment of electromagnetic showers in the TeV region.

In the coming months, thanks to its ability to identify cosmic nuclei from hydrogen to beyond iron, CALET will be able to study the high-energy hadronic component of cosmic rays. CALET will focus on the deviation from a pure power law that has been recently observed in the energy spectra of light nuclei. It will extend the present data to energies in the multi-TeV region with accurate measurements of the curvature of the spectrum as a function of energy, and of the abundance ratio of secondary to primary nuclei – an important ingredient to understand cosmic-ray propagation in the Galaxy.

Fast radio bursts reveal unexpected properties

Two studies show that fast radio bursts (FRBs) have a richer phenomenology than initially thought and might fall into two different classes. While one group was able, for the first time, to pinpoint the location of an FRB and constrain the baryon density in the intergalactic medium, a second study has found repeated FRBs from the same source, which cannot be of cataclysmic origin.

FRBs are very brief flashes of radio emission lasting just a few milliseconds. Although the first FRB was recorded in 2001, it was detected and recognised as a new class of astronomical event only six years later (CERN Courier November 2007 p10), having been overlooked until a re-analysis of the data in search of very short radio pulses. Since then, more than 10 other FRBs have been detected, and they all point to very powerful events occurring at cosmological distances (CERN Courier September 2013 p14). Unlike for gamma-ray bursts (GRBs), there is a way to infer the distance, via the time delay of the pulse observed at different radio frequencies. This delay increases towards lower radio frequencies and is proportional to the dispersion measure (DM), the integrated density of free electrons along the line of sight from the source to Earth.
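The frequency dependence of the arrival time follows the usual cold-plasma dispersion relation, with the delay scaling as the inverse square of the observing frequency. A minimal sketch using the standard dispersion constant (the DM and frequencies below are illustrative values, not taken from either study):

# Arrival-time delay of a dispersed radio pulse between two observing frequencies:
# dt [ms] ~ 4.15 * DM * (nu_lo^-2 - nu_hi^-2), DM in pc cm^-3, frequencies in GHz.
K_DM = 4.15                      # dispersion constant [ms GHz^2 / (pc cm^-3)]
dm = 500.0                       # illustrative dispersion measure [pc cm^-3]
nu_lo, nu_hi = 1.2, 1.5          # example band edges [GHz]

delay_ms = K_DM * dm * (nu_lo**-2 - nu_hi**-2)
print(f"delay across the band ~ {delay_ms:.0f} ms")   # larger DM -> larger delay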

The real-time detection of an FRB at the Parkes radio telescope has now made it possible, for the first time, to search quickly for afterglow emission, as has routinely been done for GRBs for more than a decade (CERN Courier June 2003 p12). Only two hours after the burst, the Australia Telescope Compact Array (ATCA) observed the field and identified two variable compact sources. One of them was rapidly fading and is very likely the counterpart of the FRB. This achievement is reported in Nature by a collaboration led by Evan Keane of the Swinburne University of Technology in Australia, who is also project scientist of the Square Kilometre Array Organisation.

What makes the study so interesting is that the precise localisation of the afterglow allowed identification of the FRB’s host galaxy and, therefore, via its redshift of z = 0.492±0.008, the precise distance to the event. With this information, the DM can be used to measure the density of ionised baryons in the intergalactic medium. The obtained value of ΩIGM = 4.9±1.3, expressed in per cent of the critical density of the universe, is in good agreement with the cosmological determinations by the WMAP and Planck satellites.

The second paper, also published in Nature, reports the discovery of a series of FRBs from the same source. A total of 10 new bursts were recorded in May and June 2015, and they correspond in location and DM to an FRB first detected in 2012. This unexpected behaviour was found by Paul Scholz, a PhD student at McGill University in Montreal, Canada, while sifting through data from the Arecibo radio telescope in Puerto Rico. The recurrence of bursts on minute-long timescales cannot come from a cataclysmic event, but is likely to be from a young, highly magnetised neutron star, according to lead author Laura Spitler of the Max Planck Institute for Radioastronomy in Bonn, Germany. It is likely that this FRB is of a different nature to the other FRBs.

The status of the field is reminiscent of that of GRBs in the 1990s, with the first afterglow detections and redshift determinations in 1997, and the earlier understanding that soft gamma repeaters are distinct from genuine extragalactic GRBs, which are cataclysmic events like supernova explosions and neutron star mergers.

The ILC project keeps its momentum high

Résumé

The International Linear Collider project still has the wind in its sails

It is three years since the international community planning the International Linear Collider published its technical design report. The International Linear Collider is a proposed new particle accelerator that would collide electrons with their antiparticles, positrons, at an energy of 500 GeV. The project features on every particle-physics road map, but no decision has yet been taken as to whether or not it should be built. In the meantime, R&D continues on key elements of the state-of-the-art accelerator and detectors, in particular on those aspects of the design that depend on where the machine would be built.

It’s been three years since the worldwide community of the International Linear Collider (ILC) published its Technical Design Report (TDR). The proposed new particle accelerator would smash electrons and their antiparticles, positrons, into each other at energies of 500 GeV. However, even though the ILC features on all particle-physics road maps worldwide, no decision has been taken so far as to whether or not it should be built. In the meantime, R&D continues on key aspects of the state-of-the-art accelerator and detectors, with particular focus on those aspects of the design that depend on where the machine would be built. A proposed site exists and, if the project goes ahead, the machine would be built underneath the lush hills of a region in northern Japan called Kitakami, in Iwate prefecture, some three hours north of Tokyo. The green light depends on commitments from, and negotiations between, many governments, notably the Japanese government, which hasn’t yet confirmed its willingness to host the world’s next big particle-physics adventure.

The ILC is said to complement results from the LHC because of the different nature of its collisions. Whereas the LHC collides protons with protons, the ILC would collide electrons with their antiparticles, positrons, with the option of starting out as a Higgs factory at 250 GeV and upgrading to 1 TeV in later stages. The physics case has recently been summed up in a paper published in the European Physical Journal C: “Due to the collision of point-like particles the physics processes take place at the precisely and well-defined initial energy √s, both stable and measurable up to the per-mille level,” the paper states. The energy at the ILC is tunable, which allows precise energy scans to be carried out and permits kinematic conditions for the different physics processes to be optimised. In addition, the beams can be polarised: the electron beam up to about 80%, the positron beam up to about 30%. Thanks to all of these features, it is possible to fully reconstruct the final states, so that numerous observables such as masses and total cross-sections, but also differential energy and angular distributions, are available for data analyses. For more information, see Eur. Phys. J. C 75 371 (2015), doi:10.1140/epjc/s10052-015-3511-9.

Precise, efficient and novel systems

The ILC would use superconducting radiofrequency technology to accelerate its particles. Some 16,000 1 m-long accelerating cavities made of pure niobium, with an accelerating gradient of up to 35 MV/m, are needed to get the electrons and positrons up to speed. The final-focus system needs to be extremely precise and efficient if collisions at the design luminosity of 2 × 1034 cm–2 s–1 are to occur in the two detectors. The detectors – after planning, design and testing by universities from around the world, involving many students – will take turns to sit at the interaction point. A novel system called “push–pull”, where one detector is pushed into the interaction point while the other is pulled out, so that one can take data while the other is being serviced, was devised in the course of the R&D work for the project’s TDR, published in 2013. Compared with the option of switching the beam between two separate interaction regions, this choice cut the estimated cost significantly because it eliminated several kilometres of tunnel and a considerable volume of cavern excavation in one go.
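A rough consistency check relates the quoted cavity count and gradient to the collision energy: the energy gain per beam is roughly the number of cavities in one linac times the active length per cavity times the gradient. A back-of-envelope sketch, assuming the cavities are split evenly between the two main linacs and operate at an average gradient somewhat below the 35 MV/m maximum (both assumptions for illustration):

# Energy reach per beam ~ (cavities per linac) x (active length) x (gradient).
# The even split between linacs and the average gradient are assumed values.
n_cavities_total = 16_000
cavity_length_m = 1.0          # quoted active length per cavity [m]
gradient_mv_per_m = 31.5       # assumed average operating gradient [MV/m]

e_beam_gev = (n_cavities_total / 2) * cavity_length_m * gradient_mv_per_m / 1000.0
print(f"~{e_beam_gev:.0f} GeV per beam, ~{2 * e_beam_gev:.0f} GeV in the centre of mass")
# -> ~250 GeV per beam, consistent with the 500 GeV collision energy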

The TDR sets the estimated cost of the project at $7.8 billion plus 23 million man-hours. This includes all civil engineering, technology production, construction, the accelerator components, etc, but it does not include detectors, contingency, escalation or operating costs. “The basis of the final design and the future construction for the ILC project has been completed, and we’re basically ready to push the green button,” said Barry Barish, who directed the Global Design Effort (GDE) – the team of physicists and engineers from around the world that, from 2005 to 2013, took the project to a construction-ready stage. Three previous regional projects (NLC, JLC and TESLA) needed to be combined into the best and most cost-effective option. People were busy evaluating one option against others, coming up with new ones, checking compatibilities and keeping an eye on the cost. Despite some major setbacks along the way, the R&D work culminated in the TDR. But even though the maturity of the technologies would allow the machine to be built tomorrow, the tunnel-boring machines have to wait for the official green light.

With the publication of the TDR, the mandate of the GDE ended, and a new organisation was put in place: the Linear Collider collaboration, or LCC. Barry Barish returned to LIGO to find gravitational waves, and Lyn Evans took over and united the friendly competitors, the ILC and the Compact Linear Collider (CLIC) study, under one organisational roof. Even though the two linear colliders have very different designs, there are still synergies to be exploited between them. Detector developers, for example, work closely together on such state-of-the-art parts as high-granularity calorimeters within the CALICE collaboration. These high-granularity calorimeters have, in fact, spun off to the LHC, and will be used in the CMS detector’s calorimeters for the high-luminosity upgrade.

Move from technology to diplomacy

Lyn Evans, former LHC project leader and director of the Linear Collider collaboration, founded in 2013, calls the process a move from technology to diplomacy. Together with his team of project and regional directors, he is busy facilitating negotiations between state officials from various countries to get the approval process under way. The process is slow, and requires many small steps and a large number of study groups and committees; while Japan needs reassurance from the governments and funding agencies of potential future member states before taking a decision to host, those partner states would prefer to hear “Let’s go!” from Japan before committing themselves to vast amounts of money and manpower. To break the impasse, discussions between political leaders from the relevant countries and proactive approaches by scientists to their own governments are under way. A decision is expected sometime around 2018.

Research and design work hasn’t stopped, though. The main focus is now on adapting the generic collider design to the specifications of the future site. For example, access shafts and tunnels have been adapted to the geology that exists at the site. With the help of a civil-engineering tool originally developed for the Future Circular Collider study, the interaction region has now shifted by a few kilometres so that detector parts can be lowered into the cavern vertically, rather than needing to be driven in on an inclined slope. Engineers are looking at the nearest port that would receive most of the huge accelerator and detector parts from around the world, and at the bridges that these components would need to cross. One might expect doubts, even fear, from the local community, but the contrary is the case: pro-ILC banners, drawings, bumper stickers and flags along roads are visible proof of the region’s support. Local governments have set up ILC promotion offices manned by international residents of Japan, who make sure that everybody in Kitakami not only knows about but also gives their blessing to the ILC. Hitoshi Yamamoto, professor at Tohoku University, tells of the support that he witnessed during a recent site visit of the civil-engineering group: a grandfather and granddaughter saw the group of researchers standing on a field and walked up to them. The group expected to be told to go away, but instead the grandfather pointed at his granddaughter, saying “Please try your best to build the ILC – for this child.”

A new international science project would undoubtedly bring benefits to the region, even though the global nature of the ILC would mean that components and parts would be built and tested in labs and universities around the world, and then shipped to their final destination, mirroring what was done for the LHC at CERN. Industrialisation is therefore a high-priority topic for the ILC community: getting 16,000 high-tech cavities built, tested and shipped halfway around the world is no trivial undertaking, and researchers are learning a lot from the European X-Ray Free-Electron Laser (European XFEL) currently under construction at DESY in Hamburg, Germany. This uses the same technology as the ILC over a length of some 2 km, providing a neat model for serial production of cavities and cryomodules. The European XFEL, which started life as a spin-off from the TESLA accelerator once planned at DESY and likewise based on superconducting RF technology, employed two companies for cavity production and devised a complicated (but functioning) ballet of component production, transport, testing and integration between the various production sites, the companies, DESY, the French CEA laboratory Irfu in Saclay and the CNRS lab LAL in Orsay. For the ILC, an order of magnitude more parts will have to be shipped around the world.

Redoubled international efforts

The International Committee for Future Accelerators (ICFA) has decided to continue the linear-collider organisation, and has extended the mandate of the LCC by a year. At the February meeting, ICFA reached a consensus that the international effort, led by ICFA, for an ILC in Japan should continue, and a subgroup has been formed to study the future of the linear-collider organisation and make a proposal for a new structure to be in place from 2017.

ICFA has traditionally been the committee to which the ILC effort reported its progress, the body that set up committees and boards, gave them their mandates and monitored developments. Its partner organisation, the Asian Committee for Future Accelerators (ACFA), met with the Asia-Pacific High Energy Physics Panel (AsiaHEP) in February, and decided to issue a statement about the ILC and the potential circular Higgs factory to be built in China, CEPC. About the ILC, the statement says: “AsiaHEP and ACFA reassert their strong endorsement of the ILC, which is in a mature state of technical development…In continuation of decades of worldwide co-ordination, we encourage redoubled international efforts at this critical time to make the ILC a reality in Japan.” About CEPC, it states: “We encourage the effort led by China in this direction, and look forward to the completion of the technical design in a timely manner.”

These statements mirror what the strategic road maps for the future of particle physics in the different regions have said: that the physics case for the ILC is “extremely strong” and that the “interest expressed in Japan in hosting the ILC is an exciting development” (P5, US). “There is a strong scientific case for an electron–positron collider, complementary to the LHC, that can study the properties of the Higgs boson and other particles with unprecedented precision and whose energy can be upgraded,” states the European Strategy in its fifth recommendation. “Europe looks forward to a proposal from Japan to discuss a possible participation.” All of the strategies, of course, give top priority to the continued operation of the LHC and its future upgrade for operation at higher luminosities, to ensure the exploitation of its full scientific potential, and they also recommend competitive neutrino programmes and the development of a post-LHC accelerator project at CERN with global participation.

• For further details, visit www.linearcollider.org.
