The International Committee for Future Accelerators (ICFA) has issued a statement on the need for continuous and stable funding for large international science projects, such as the proposed International Linear Collider. This statement is a reaction to recent cuts in the science budgets of the UK and the US, and addresses governments and science funding agencies around the world.
In the statement, ICFA expresses its deep concern about the recent decisions in the UK and the US on spending for long-term international science projects. It points out that “frontier science relies increasingly on stable international partnerships, since the scientific and technical challenges can only be met by enabling outstanding men and women of science from around the world to collaborate, and by joining forces to provide the resources they need to succeed”.
The statement continues: “A good example is the proposed International Linear Collider. In order to advance the understanding of the innermost structure of matter and the early development of the universe, several thousand particle physicists and accelerator scientists around the world, during the past 15 years, have co-ordinated their work on developing the technologies necessary to make a linear collider feasible.
“In view of these tightly interlinked efforts, inspired and driven by the scientific potential of the linear collider, the sudden cuts implemented by two partner countries have devastating effects. ICFA feels an obligation to make policy makers aware of the need for stability in the support of major international science efforts. It is important for all governments to find ways to maintain the trust needed to move forward international scientific endeavours.”
The LHC is well known for its hundreds of superconducting magnet assemblies, but it will also use 154 normally conducting “warm” magnets. On 19 February, a celebration took place at CERN to mark the installation of the last of these magnets, which play a key role in shaping the course of the proton beams.
The LHC consists of a number of arcs linked by straight sections, where the field required to bend the beams is not as high as elsewhere and warm magnets can be used. One advantage is their robustness when exposed to radiation, which is important, for example, in the two long straight sections where collimators clean the beam by removing particles far from the central orbit. Using superconducting magnets there would be problematic, as secondary particles produced in the collimators could reach the magnets and cause a quench. Warm magnets are also more affordable than superconducting magnets and, owing to their comparative simplicity, easier to make and install. Nearly all of the warm magnets were made specially for the LHC and travelled great distances to reach CERN: 48 came from TRIUMF in Vancouver; 40 from the Institute for High Energy Physics in Protvino, near Moscow; and 65 from the Budker Institute of Nuclear Physics in Novosibirsk, Russia. Only one magnet had a previous existence: originally built for the ISR in the 1970s, it is now being used in the ALICE experiment.
On 8 February, the LHCb collaboration succeeded in extracting data from an almost-complete set of detectors. The exercise was an important test run to flag up any problems in the experiment before the next commissioning week in March.
Sixty electronic boards that read out information from the triggered events were used during the tests, collecting data at a rate of 100 Hz. As not all of the boards have yet been installed and commissioned, this represents only around 20% of the data that the detectors are capable of generating.
There were also indications that cosmic-ray tracks were recorded in the calorimeter system and the outer tracker. Because the LHCb detectors are aligned in vertical planes, it is not easy to record the tracks of cosmic rays; however, rare cosmic rays that travel almost horizontally do pass through LHCb at a rate of a fraction of a hertz. The main goal of the commissioning week in March is to collect cosmic events in all of the detectors together. This will allow an initial alignment in time of all of the components and provide some tracks for the reconstruction and alignment teams.
The ATLAS collaboration celebrated the lowering of the final large piece of the detector into the underground cavern on 29 February. The event marked a major milestone for the muon spectrometer group and completed the installation of large detector components below ground.
ATLAS is the world’s largest general-purpose particle detector, measuring 46 m long, 25 m high and 25 m wide. It weighs 7000 tonnes and consists of around 100 million sensors that will measure particles produced in collisions in the LHC. The first major piece was installed in 2003 and since then, many detector elements have journeyed down the 100 m shaft into the cavern. This last piece completes the gigantic puzzle.
Modest in scale compared with the large 25 m diameter muon spectrometer wheels, this last element is one of two pieces known as the “small” wheels, the first of which descended into the cavern two weeks earlier. Each small wheel measures 9.3 m in diameter and weighs 100 tonnes, including massive shielding elements. Unlike the large wheels, whose assembly was completed in the cavern, they were assembled on the surface on CERN’s main site and then transported slowly and carefully by road to the ATLAS cavern at Point 1 on the LHC ring.
The whole muon spectrometer system would cover an area equivalent to that of three football fields and includes 1.2 million independent electronic channels. The small wheels, which complete the system, are closest to the interaction point in the small-angle region, next to the calorimeter and the two end-caps of the toroid magnet. Each small wheel consists of two discs placed one against the other and slotted together with a central cylindrical component. One disc provides shielding, which will absorb particles other than muons, while the other acts as the support for the detection chambers. Three of the four different kinds of muon chamber found in ATLAS are fitted onto the small wheels: trigger chambers and two types of precision chamber – cathode strip chambers and monitored drift tubes – which will allow the muon tracks to be reconstructed with a precision of 10 μm.
The ATLAS muon-spectrometer group comprises 450 physicists from 48 institutions, and includes members from China, France, Germany, Greece, Israel, Italy, Japan, the Netherlands, Russia and the US. They have spent more than a decade developing, planning and constructing the ATLAS muon spectrometer, with the shielding elements constructed in Armenia and Serbia.
Barry Barish likes a challenge. He admits to a definite tendency to go for the difficult in his research – in his view, life is an adventure. Some might say that his most recent challenge would fit well with a certain famous TV series: “Your mission, should you choose to accept it… is to produce a design for the International Linear Collider that includes a detailed design concept, performance assessments, reliable international costing, an industrialization plan, and siting analysis, as well as detector concepts and scope.”
Barish did indeed accept the challenge in March 2005, when he became director of the Global Design Effort (GDE) for a proposed International Linear Collider (ILC). He started in a directorate of one – himself – at the head of a “virtual” laboratory of hundreds of physicists and engineers around the globe. To run the “lab” he has set up a small executive committee, which includes three regional directors (for the Americas, Asia and Europe), three project managers and two leading accelerator experts. There are also boards for R&D, change control and design cost.
Barish operates from his base at Caltech, where he has been since 1962, ultimately becoming Linde Professor of Physics (now emeritus). His taste for research challenges became evident in the 1970s, when he was co-spokesperson with Frank Sciulli (also at Caltech) of the “narrow band” neutrino experiment at Fermilab that studied weak neutral currents and the quark substructure of the nucleon. He later became US spokesperson of the collaboration behind the Monopole, Astrophysics and Cosmic Ray Observatory (MACRO), which operated from 1989 to 2000 in the Gran Sasso National Laboratory (LNGS). The experiment did not find monopoles, but it set the most stringent upper limits so far on their existence.
In 1991 he also began to lead the design of the GEM detector for the Superconducting Super Collider project, together with Bill Willis of Columbia University. In October 1993, however, the US Congress infamously shut down the project and Barish found himself in search of a new challenge. He did not have to look far, as Caltech was already involved in the Laser Interferometer Gravitational-wave Observatory (LIGO), conceived to search for effects even more difficult to detect than neutrinos. The project was already approved and just beginning to receive funding. Barish became principal investigator in 1994 and director of the LIGO Laboratory in 1997.
Here was an incredibly challenging project, Barish explains, that was “making the audacious attempt to measure an effect of 1 in 10²¹”. It has indeed achieved this precision, but has not yet detected gravitational waves. “Now it’s down to nature,” says Barish, who found the work on LIGO very satisfying. “There is no way I would have left it except for an exciting new challenge – and the ILC is certainly challenging.” He says that it was hard to move on, “but I felt I could make a difference”. Moreover, he adds: “The likelihood is that the ILC will be important for particle physics.”
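To get a feel for the scale of that number: the strain measured by an interferometer is the fractional change in its arm length, so for arms of LIGO’s 4 km length (a standard figure for the observatory, not one quoted by Barish here) a strain of 10⁻²¹ implies

\[
\Delta L \sim h\,L \approx 10^{-21} \times 4\,\mathrm{km} = 4 \times 10^{-18}\,\mathrm{m},
\]

a displacement hundreds of times smaller than the diameter of a proton.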
At 72 years old, Barish does not expect to participate in the ILC – the earliest it could start up would be in the 2020s. “The plan is short term. The question was whether I could pull together a worldwide team to conceive of a design that will do the job,” he says. With no background in accelerator physics, Barish may not seem the obvious choice for the task. However, he points out that “coming in from the outside, not being buried in the forest, can be very useful”. In addition he believes that he is a good student, and that a good student can be a good leader: “If you do your homework, if the people you work with respect you, then it’s possible.”
An important factor in building the team behind the GDE is that there is not as much history of collaboration in accelerator physics as there is in experimental particle physics. Barish points out that many of the members of the accelerator community have met only at conferences. There has never been real collaboration on accelerator design, so the GDE is a learning process in more than one sense. There are also interesting sociological issues, as the GDE has no physical central location, and meetings usually take place via video and tele-conferencing. Barish likens his job as director to “conducting the disparate instruments in an orchestra”.
In February 2007, the GDE reached a major milestone with the release of the Reference Design Report (RDR) for a 31 km long electron–positron linear collider, with a peak luminosity of about 2 × 10³⁴ cm⁻²s⁻¹, at a top centre-of-mass energy of 500 GeV, and the possibility of upgrading to 1 TeV. The report contains no detailed engineering; it might state, for example, that a magnet is needed for a certain task, but it does not describe how to build it. The report also contains a preliminary cost estimate, of some $6700 m plus 13,000 person-years of effort.
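To put the quoted luminosity into perspective, the event rate for a given process is simply the luminosity multiplied by the cross-section. Taking a purely illustrative cross-section of 1 fb (not a figure from the RDR),

\[
\frac{\mathrm{d}N}{\mathrm{d}t} = \mathcal{L}\,\sigma
= \bigl(2 \times 10^{34}\,\mathrm{cm^{-2}\,s^{-1}}\bigr)\bigl(10^{-39}\,\mathrm{cm^{2}}\bigr)
= 2 \times 10^{-5}\,\mathrm{s^{-1}},
\]

or roughly 200 events in a nominal running year of 10⁷ s.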
The final goal will be to produce a strong engineering design and to optimize the costing to form a serious proposal. An appealing deadline is the 2010 ICHEP meeting in Paris. By then there should be results from the LHC that could justify the project. “The main job,” says Barish, “is to design a good machine and move once it’s justified.”
In the meantime there is important R&D to be done. Two key areas concern the high accelerating gradient proposed for the machine – an average of 31.5 MV/m – and the effects of electron clouds. Electrons emitted from the walls of the beam pipe cause the positron beam to blow up, thereby reducing the luminosity and ultimately the number of events. The clouds decay naturally and cease to be a problem if there is sufficient time between bunches, but this reduces the collision rate. The conservative option to keep the rate high would be to have two positron rings to inject alternate pulses into the linac. However, this has huge cost implications so, as Barish says: “There is huge motivation to solve the problem.” One attractive possibility that needs further investigation involves grooving and coating the beam pipe, which could reduce the electron cloud a hundredfold.
However, just before the end of 2007, bad news on funding in both the US and the UK struck a major blow to the plan foreseen at the time that the RDR was released. The UK dealt the first strike, stating that it would “cease investment” in the project, while the US reduced funding for the ILC from $60 m to $15 m as part of a hastily agreed compromise budget for FY2008. Barish recalls the complete surprise of the congressional decision on a budget that President Bush had put forward in February 2007. “We went to bed as normal on Friday (14 December), and woke up on Monday to find the project axed out.”
The cuts in the two countries are both quantitatively and qualitatively different. In one sense the UK’s decision is more serious, as it appears to be a policy decision taken with no input from the community (see Particle physics in the UK is facing a severe funding crisis). Barish says that the main loss here to the GDE is in intellectual leadership. He hopes that continued funding in the UK for general accelerator R&D will mean that the project does not lose people that he says are irreplaceable. In contrast, he expects to see the R&D for the ILC revived in the US budget for FY2009 (starting October 2008), albeit at a level lower than the $60 m originally promised for FY2008. Here the problem is how to cope with the loss of people over the coming months, as there is no funding left to support them in the current budget. Where it hurts most, says Barish, is that the US will not be able to develop the same level of home-grown expertise in the technology required for the ILC, compared with Japan or Europe.
A revival of the ILC in the US budget was a key assumption when Barish and the GDE executive committee met for a relatively rare face-to-face meeting at DESY on 12 January to formulate a new plan. At least the collaboration that Barish has forged is “strong enough to give us the ability to adjust and move on, even with reduced goals”. The aim of the new plan that has emerged is to reduce the scope of the R&D work, but maintain the original schedule of completion by 2010 for items with the highest technical risk, while stretching other parts of the programme to 2012.
The work on high gradients, under way globally, and tests at Cornell University on reducing the electron cloud will remain high priorities for part one of the newly defined Technical Design Phase, to be ready for 2010. Part two, which will focus on the detailed engineering and industrialization, should be ready by 2012.
Looking further ahead, Barish acknowledges that an ILC-like machine could be the end of the line for very high-energy accelerators, but he points out that accelerators for other applications have a promising future. The GDE itself is already playing an important role in teaching accelerator physics to a new generation. “There is no better way to train them than on something that is pushing the state of the art,” he says. In fact, he sees training as a limiting factor in breeding new experts – whether young people or “converts” from other areas of physics, as many accelerator physicists now are. One problem that he is aware of is that “accelerator people are not revered – but they should be!”.
Despite the recent setbacks with the GDE, Barish remains determined to achieve his mission. “In these ambitious, long-range projects you are going to hit huge bumps in the road, but you have to persevere,” he says. What is vital, in his view, is that the agenda should remain driven by science, and that this alone should determine if and when the ILC is built on the firm foundations laid by the GDE. Let us hope that those who fund particle physics have the vision to ensure that one day he can say: “Mission accomplished.”
With the completion of two major installation projects, nearly all the infrastructure for the ALICE experiment is now in place in the cavern at Point 2 on the LHC ring, near St. Genis-Pouilly in France. Further round the ring, at Point 5 near Cessy, the final large piece has descended into the cavern for the CMS detector, which is the first of its kind to have been constructed above ground before being lowered piece by piece into the cavern below.
In the ALICE cavern, the electromagnetic calorimeter support structure and “mini” space frame – the device for connecting service networks to the inside of the detector – went into position at the end of 2007. At 7 m high and with a weight of 30 tonnes, the calorimeter support structure consists of two austenitic stainless steel “half-shells”, welded and bolted together to give the structure its specially designed curved form. Once the calorimeter is in place, the support will have to bear nearly three times its own weight.
Lowering the structure and positioning it within the ALICE detector constituted major technical challenges for the ALICE installation team, who made numerous preparatory tests, and had to dismantle two walkways to free up the passage into the cavern. They then inserted the structure inside the detector, sliding it in between the face of the magnet and the metal space-frame that bears detector systems in the centre of ALICE. At one point there was a clearance of only a couple of centimetres between the moving structure and the magnet.
Two weeks later, the final big piece of the jigsaw went into place – the “mini” space frame, which resembles a giant socket outlet and weighs 14 tonnes. The device is almost 10 m long and carries all the supply cables for the services required for detector operation (gas, water, electricity, signals). Sitting straight across from the magnet, it connects the inner detector with the outside world.
Around the LHC ring at Point 5, the CMS Collaboration has celebrated the descent of the 15th and final large element of the detector into the experiment’s cavern (see CMS starts underground). Weighing 1430 tonnes and asymmetrical in shape, the end cap designated YE+1 is the largest of the three pieces that form one end of the detector. After the lowering operation, the piece was temporarily stowed as far as possible from the central components to leave room to cable the central tracker (see CMS installs the world’s largest silicon detector) and install the beam pipe.
Once these components are all in place CMS will be almost ready. All that will remain will be to seal all the components of the detector, and perform the final tests with cosmic rays and the magnet fully powered to 4 T.
The Void by Frank Close, Oxford University Press. Hardback ISBN 9780199225903, £9.99 ($19.50).
This is a small book – you can read it in an evening – about the intriguing subject of “nothing”. Close takes us through history from the earliest philosophers, who concluded that “Nature abhors a void”, through the period of arguments about the non-existent Ether, up to the present time, where the void is considered to be a seething quantum-mechanical foam. He describes how the concepts of space and time are linked to the different ideas of “the Void” and ends with current speculations: maybe our entire universe is a quantum fluctuation with near-zero total energy. I learnt that the different forces are influenced by the structure of the void and that some constants of nature may be the random result of spontaneous symmetry breaking, both of which added to my very tenuous non-grasping of the Higgs question.
So far so good. Fortunately, I had already read a number of texts around the subject, for some passages are difficult to grasp because of the sometimes ungrammatical sentences – page 35 gets my all-time prize for totally confusing the reader.
It is unclear to me who the target audience is; the level required to understand the text ranges widely depending on the chapter. Close sometimes uses advanced concepts without explanation and relies on the reader having more than a little familiarity with the mysteries of quantum mechanics. As with most popular books of this type, those mysteries remain whole, although I must say in favour of The Void that it manages, for once, to leave out Schrödinger’s cat.
I also wonder if the text has been proof-read. Here are just two examples from too large a set: though Close was once head of communications and public education at CERN, he tells us that CERN started in 1955 (it was 1954) in a sentence that cannot be parsed in any language. I share some of his criticisms of CERN’s exhibition centre, but find it difficult to accept that for an entire page he uses “La Globe”, when the correct French is “le Globe” as can be read on CERN’s public website. Did no-one spot this? Fortunately for the author, but unfortunately for the publishing business, this book is not alone in being the victim of such sloppiness.
So, The Void is well worth reading; then send in your corrections.
A premature end to SLAC’s B-factory, a stop to UK investment in the International Linear Collider (ILC) project and more than 300 lay-offs at Fermilab and SLAC – 2007 ended on a bad note. Do these cuts in the US and the UK signal a general downturn for particle physics? No! In the UK they are the combined result of organizational changes and an emphasis on national facilities (see Particle physics in the UK is facing a severe funding crisis). In the US they arose from disputes in Congress, and there, at least, the president’s budget for FY2009 looks more positive. The reasons for these cuts are thus too specific to call them a trend – all the more since, for example, KEK’s five-year plan strongly endorses ILC research and funding has increased recently in countries such as Germany.
We have seen frequent ups and downs in funding over the decades. So is it business as usual? Not quite. Nowadays, these ups and downs must be seen in the framework of global co-operation and the interdependence of projects in particle physics.
The size and cost of our facilities are so large that they can only be realized in a truly international context: a machine like the LHC will exist only once in the world. Equally, it would be inefficient to clone billion-euro projects for future neutrino physics, super B-factories or astroparticle physics. Moreover, the R&D necessary for high energies and intensities for future accelerators – and for future detectors – can only be performed in a stable and organized worldwide effort.
In theory, everyone agrees that a global distribution of responsibilities is the most cost-effective approach, allowing particle physics to make the best use of worldwide interests and expertise, and guaranteeing a broad and complementary exploration of our field. Making this a reality, however, is another matter. It requires agreements at a transnational level and, in particular, reliability and continuity of support.
Here we evidently have a problem: although international in character, funding of high-energy physics projects is, and will be, largely national. There are few internationally binding treaties like the one for CERN, which generates a stable financial situation. Most agreements are memoranda of understanding or even less formal. Common goals are subject to the “good will” of national funding and therefore to changing economic situations and national political and scientific priorities. Particularly vulnerable are projects that require significant R&D without clearly defined financial contributions.
In addition there is the mere fact that supporting national facilities is politically easier than financing international ones. Which representative would lobby for a project that is not in their country or constituency? Surely it is better to cut the ILC than the local research facility.
Consider the example of the ILC more closely. The consensus is that it will be the next big machine, and that there will be only one such machine. Even if the ILC is not imminent, R&D is mandatory to optimize costs and arrive at a technically sound proposal. Following this ideal, the worldwide community formed a global network and began to develop special expertise (see Barry Barish and the GDE: mission achievable). The UK, for example, was leading the effort on damping rings, beam delivery and positron sources. Ceasing support for ILC R&D in the UK therefore cuts a large hole in the international network. Who can take over – and at what cost? Yes, stopping R&D saves money in the short term, but in the long term it will cost more. Perhaps even more damaging would be the resulting loss of confidence in pursuing projects internationally.
What can we conclude? Well, the first point is rather trivial: particle physics is part of society. We are not free from general economic constraints and we have to compete with important social, political, ecological and scientific goals. We can only progress if we repeatedly make it clear what it is we give back to society in return.
However, we really need a more organized way of setting internationally agreed priorities, with more binding definitions of national responsibilities and financial commitments for large-scale projects, including their R&D phase. The CERN Council strategy group, together with the funding agencies in the CERN Council, is an important tool and should be a step towards a transcontinental equivalent. Note, however, that even this is no guarantee of reliability, as is evident from the termination of US contributions to the ITER project.
CERN, as any other laboratory hosting a large-scale facility, should see itself as an important part of a large network, serving the interests of its members and contributing states. Universities and national laboratories should not be seen as an appendix, but as key participants. We should work actively to make it evident in all countries that contributing to CERN eventually feeds back into domestic technological and scientific progress.
An excellent and successful LHC project is key to further international co-operation in particle physics. If nature reveals new effects and causes public excitement, many problems we face now will be easier to solve.
Hadron physics investigates one of the open frontiers of the Standard Model: the strong interaction at large gauge coupling. Experimentally, there are currently two major strategies. Precision experiments study symmetries and their violations with the aim of extracting fundamental quantities of QCD, such as the quark masses. Studies of excited states and their decays, on the other hand, try to establish the ordering principles of the hadronic spectra, to shed light on the problem of quark confinement.
The common aspects in both the charmed sector and the light quark sector were the major reason to bring together 350 experts from high-energy physics and nuclear physics to the 11th International Conference on Meson–Nucleon Physics and the Structure of the Nucleon (MENU 2007), which took place on 10–14 September 2007 at the Research Centre Jülich. The plenary sessions provided a broad review of the field, while invited and contributing speakers covered special topics, such as spin physics, meson and baryon spectroscopy, lattice calculations and in-medium physics, in five parallel sessions.
The light quark sector
Jürg Gasser, of the University of Bern, opened the conference with a review talk on chiral effective field theory, the standard tool for hadron physics in the threshold region. Lattice calculations have come into contact with chiral perturbation theory (χPT) by obtaining values for the low-energy constants l3 and l4. The DIRAC and NA48 experiments at CERN have tested the predictions of χPT by studying the level shifts of pionium and the decay of charged kaons into three pions. Rainer Wanke, of Mainz University, reported on the recent high-statistics data from NA48/2. The data have allowed the extraction of the S-wave pion–pion scattering length with great precision from studies of the Wigner cusp in the two-pion subsystem, as Ulf-G Meissner and collaborators predicted in 1997. The results agreed with the χPT predictions after inclusion of isospin-breaking effects.
Johan Bijnens of Lund University emphasized in his review on η physics that the decays of both the η and the η’ mesons are good laboratories for studying non-dominant strong-interaction effects. The slope parameter α in the neutral three-pion decay of the η is a puzzling challenge, as χPT does not explain the sign of the slope parameter, even when pushed to next-to-next-to-leading order, while non-perturbative approaches do. Magnus Wolke of the Research Centre Jülich showed Dalitz plots for the decay of the η into three neutral pions, which the WASA-at-COSY Collaboration obtained in the first production run in April/May 2007. The WASA detector was transferred from Uppsala to Jülich in 2005. Cesare Bini of the Sapienza Università di Roma reviewed recent KLOE results featuring the η mass, the measurement of η–η’ mixing, the slope parameter of the η decay and results on the scalar mesons f0(980) and a0(980) seen in φ-decay. Patrick Achenbach of Mainz showed the first results on η’ decays into η and two neutral pions from the CB-TAPS experiment at the MAMI-C electron accelerator at Mainz. Catalina Curceanu presented the recent progress on kaonic hydrogen by the SIDDHARTA collaboration at the DAΦNE facility, which will allow physicists to obtain the antikaon–nucleon scattering lengths.
Effective field theory is beginning to make an impact on traditional nuclear physics with a consistent treatment of two-nucleon and three-nucleon interactions. Theorists have for many decades considered three-nucleon forces as a possible explanation for the unsolved problem of the saturation properties of nuclear matter. Kimiko Sekiguchi of RIKEN, Stanislaw Kistryn of the Jagiellonian University Krakow, and Daniel Phillips of Ohio University showed how the possibilities of studying polarized proton–deuteron reactions provide a direct experimental access to the three-nucleon force. In addition, the progress in applying lattice methods to study hadrons, hadron–hadron interactions and eventually nuclei, figured in the talks by Silas R Beane of the University of New Hampshire, Uwe-Jens Wiese of the University of Bern, and Andreas Schäfer of the University of Regensburg.
The charm sector
The decay of heavy mesons produced at the present generation of electron–positron colliders sheds new light on the light-meson sector, because the scalar mesons f0(980) and a0(980) are found in the decay products, for example in the reaction D0 → K0π+π–, as Michael Pennington of Durham University stressed in his talk. Joseph Schechter of Syracuse University presented effective Lagrangian methods for the light scalar meson sector. The new charmed mesons Ds(2317) and Ds(2460), together with the charmonium-like states X, Y and Z, can be considered as unexpected contributions from the B-factories, as Ruslan Chistov from ITEP Moscow pointed out in his overview of results from the Belle experiment at KEK. B-decays suppress the background contributions and offer large branching fractions, thus allowing an angular analysis to obtain quantum numbers. Walter Toki of Colorado State University discussed the recent results on the X(3872) and Y(3940) mesons from the BaBar experiment at SLAC and on the Z(4430) discovered by Belle, which apparently do not fit into the conventional quark–antiquark model for mesons. The Z(4430) may be a hadronic molecule made of a D*(2010) and a D1(2420), or a tetraquark, and, if confirmed, would be especially exciting as the first charged hidden-charm state.
Matthias Lutz from the Gesellschaft für Schwerionenforschung, Darmstadt, and Craig Roberts of Argonne discussed various aspects of hadron spectroscopy, while Ulrich Mosel of the University of Giessen highlighted recent theoretical progress in modelling the medium-dependence of nucleon resonances. Ulrike Thoma of the University of Bonn reported on evidence for two new baryons – a D15(2070) and a D33(1940) – seen in η production on the nucleon at the ELSA facility in Bonn. Bing-Song Zou, of the Chinese Academy of Sciences, Beijing, observed that J/ψ decay is an ideal isospin filter for studying baryons, allowing the identification of the elusive Roper resonance as a visible bump, quite in contrast to pion–nucleon scattering. The Roper resonance is the first excited state of the nucleon with the quantum numbers of the nucleon. Results from the Beijing Spectrometer experiment show a surprisingly small Roper mass of 1360 MeV.
Haiyan Gao of Duke University showed how quark–hadron duality studies in charged pion photoproduction can be used to obtain information about resonances in the energy region above 2 GeV. Kai Brinkmann of the Technical University of Dresden reviewed results from the cooler synchrotron COSY at the Research Centre Jülich, in particular the negative result of the search for pentaquarks, while Takashi Nakano reported on the recent status of experiments at the SPring-8 synchrotron radiation facility in Japan. Mikhail Voloshin, of the University of Minnesota, reviewed the decay of charmed hadrons and pointed out the open opportunities to improve our knowledge of the Kobayashi–Maskawa matrix element Vub. Ikaros Bigi, of Notre Dame du Lac, focused on D0 oscillations, which open a unique window on flavour dynamics.
The future will see exciting new machine developments. Naohito Saito discussed progress at the new Japan Proton Accelerator Research Complex (J-PARC), while the European project for the Facility for Antiproton and Ion Research (FAIR) was covered by Paolo Lenisa from the Università di Ferrara, Mauro Anselmino of INFN Torino and Johan Messchendorp of KVI Groningen. Anthony Thomas presented the 12 GeV upgrade of the Continuous Electron Beam Accelerator Facility (CEBAF) at Jefferson Lab, and Günther Rosner of the University of Glasgow gave an overview of physics with the CEBAF Large Acceptance Spectrometer.
Willem Van Oers of the University of Manitoba gave a lively address as the representative of the International Union of Pure and Applied Physics, which, together with Forschungszentrum Jülich, the Deutsche Forschungsgemeinschaft, Jefferson Lab and the Hadron Physics I3 FP6 European Community Programme, made this conference possible.
• The next MENU conference will be held in Newport News, Virginia, in 2010.
In July 2006, the huge segments of the CMS detector came together for the first time for the Magnet Test and Cosmic Challenge at the experiment’s site near Cessy in France. Within days the commissioning teams were recording data from cosmic rays bending in the 4 T magnetic field as they passed through a “slice” of the overall detector. This contained elements of all four main subdetectors: the particle tracker, the electromagnetic and hadron calorimeters and the muon system (figure 1). Vital steps remained, however, for CMS to be ready for particle collisions in the LHC. These tests in 2006 took place at the surface, using temporary cabling and a temporary electronics barrack, 100 m or so above the LHC ring.
To prepare for the LHC start up, the segments had to be lowered into the cavern one at a time, where the complete system – from services to data delivery – was to be installed, integrated and checked thoroughly in tests with cosmic rays. The first segment – one of the two forward hadron calorimeters (HF) – descended into the CMS cavern at the beginning of November 2006 and a large section of the detector was in its final position little more than a year later, recording cosmic-ray muons through the complete data chain and delivering events to collaborators as far afield as California and China.
This feat marked an achievement not only in technical terms, but also in human collaboration. The ultimate success of an experiment on the scale of CMS lies not only in the challenge of building and assembling all the complex pieces; it also involves orchestrating an ensemble of people to ensure the detector’s eventual smooth operation. The in situ operations to collect cosmic rays typically involved crews of up to 60 people from the different subdetectors at CERN, as well as colleagues around the globe who have put the distributed monitoring and analysis system through its paces. The teams worked together, in a relatively short space of time, to solve the majority of problems as they arose – in real time.
Installation of the readout electronics for the various subdetector systems began in the cavern in early 2007, soon after the arrival of the first large segments. There were sufficient components fully installed by May for commissioning teams to begin a series of “global runs” – over several days at the end of each month – using cosmic rays to trigger the readout. Their aim was to increase functionality and scale with each run, as additional components became available. The complete detector should be ready by May 2008 for the ultimate test in its final configuration with the magnetic field at its nominal value.
At the time of the first global run, on 24–30 May 2007, only one subdetector – half of the forward hadron calorimeter (HF+), which was the first piece to be lowered – was ready to participate (figure 2). It was accompanied by the global trigger, a reduced set of the final central data acquisition (DAQ) hardware installed in the service cavern, and data-quality monitoring (DQM) services to monitor the HF and the trigger. While this represented only a small fraction of the complete CMS detection system, the run marked a major step forward when it recorded the first data with CMS in its final position.
This initial global run confirmed the operation of the HF from the run-control system through to the production global triggers and delivery of data to the storage manager in the cavern. It demonstrated the successful concurrent operation of DQM tasks for the hadron calorimeter and the trigger, and real-time visualization of events by the event display. The chain of hardware and software processes governing the data transfer to the Tier-0 processing centre at CERN’s Computer Centre (the first level of the four-tier Grid-based data distribution system) already worked without problems from this early stage. Moreover, the central DAQ was able to run overnight without interruption.
The June global run saw the first real cosmic-muon triggers. By this time, the chambers made of drift tubes (DTs) for tracking muons through the central barrel of CMS were ready to participate. One forward hadron calorimeter (HF) plus a supermodule of the electromagnetic calorimeter (which at the time was being inserted in the CMS barrel) also joined in the run, which proved that the procedures to read out multiple detectors worked smoothly.
July’s run focused more on upgrading the DAQ software towards its final framework and included further subdetectors, in particular the resistive plate chambers (RPCs) in the muon barrel, which are specifically designed to provide additional redundancy to the muon triggers. This marked the successful upgrade to the final DAQ software architecture and integration of the RPC system.
The first coincidence between two subsystems was a major aim for the global run in August. For the first time, the run included parts of the barrel electromagnetic calorimeter (ECAL) with their final cabling, which were timed in to the DT trigger. The regular transfer of the data to the Tier-0 centre and some Tier-1s had now become routine.
By September, the commissioning team was able to exercise the full data chain from front-end readout for all types of subdetector through to Tier-1, Tier-2 and even Tier-3 centres, with data becoming available at the Tier-1 in Fermilab in less than an hour. The latter allowed collaboration members in Fermilab to participate in remote data-monitoring shifts via the Fermilab Remote Operations Centre. On the last day of the run, the team managed to insert a fraction of the readout modules for the tracker (working in emulation mode, given that the actual tracker was not yet installed) into the global run, together with readout for the muon DTs and RPCs – with the different muon detectors all contributing to the global muon trigger.
The scale of the operation was by now comparable to that achieved above ground with the Magnet Test and Cosmic Challenge in the summer of 2006. Moreover, as synchronization of the different components for the random arrival times of cosmic-muon events is more complex than for the well timed collisions in the LHC, the ease of synchronizing the different triggers during this run augured well for the future.
The global-run campaign resumed again on 26 November. The principal change here was to use a larger portion of the final DAQ hardware on the surface rather than the mini-DAQ system. By this time the participating detector subsystems included all of the HCAL (as well as the luminosity system), four barrel ECAL supermodules, the complete central barrel wheel of muon DTs, and four sectors of RPCs on two of the barrel wheels. For the first time, a significant fraction of the readout for the final detector was taking part. The high-level trigger software unpacked nearly all the data during the run, ran local reconstructions in the muon DTs and ECAL barrel and created event streams enriched with muons pointing to the calorimeters. Prompt reconstruction took place on the Tier-0 processors and performed much of the reconstruction planned for LHC collisions.
To exercise the full data chain, the November run included a prototype tracking system, the “rod-in-a-box” (RIB), which contained six sensor modules of the strip tracking system. The experience in operating the RIB inside CMS provided a head-start for operation using the complete tracker once it is fully installed and cabled in early 2008 (see CMS installs the world’s largest silicon detector). The team also brought the final RPC trigger into operation, synchronizing it with the DT trigger and readout.
Installation in the CMS cavern continued apace, with the final segment – the last disc of the endcaps – lowered on 22 January 2008 (figure 5). The aim is to have sufficient cabling and services to read out more than half of CMS by March, including a large fraction of the tracker.
All seven Tier-1 centres have been involved since December, ranging from the US to Asia. In April, this worldwide collaboration will be exercised further with continuous 24 hour running, during which collaboration members in the remote laboratories will participate in data monitoring and analysis via centres such as the Fermilab ROC, as well as in a new CMS Centre installed in the old PS control room at CERN. By that stage, it will be true to say that the sun never sets on CMS data processing, as the collaboration puts in the final effort for the ultimate global run with the CMS wheels and discs closed and the magnet switched on before the first proton–proton collisions in the LHC.
• The authors are deeply indebted to the tremendous effort of their collaborators in preparing the CMS experiment.