Only four months after the crates containing parts of the new electron linac system touched ground at the test site, the National Synchrotron Radiation Research Center (NSRRC) completed the commissioning of the linac system for its second synchrotron light source facility, the Taiwan Photon Source (TPS), on 6 May.
This is a major milestone in a challenging project that has had to contend with waves of worldwide recession, turbulent inflation, price fluctuations for raw materials and unforeseen obstacles in civil construction. The contract to design and manufacture the single turn-key system, with a minimum output energy of 150 MeV, was awarded to RI Research Instruments GmbH, Germany, in November 2008. The basic design parameters stipulated a radio frequency of 2997.924 MHz (S band), pulse durations of 1 ns and 200–1000 ns for short and long pulses respectively, and a maximum repetition rate of 5 Hz. The linac consists of an electron gun, a focusing and bunching section, and an accelerating section. The electron gun provides nanosecond pulses of electrons with an energy of 90 keV. Once produced, the electrons are further focused, longitudinally bunched and transferred to the 20-m-long accelerating section. This has three acceleration structures equipped with three high-power microwave amplifiers, in which the electrons are accelerated to 150 MeV.
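The quoted parameters can be checked with some back-of-envelope arithmetic: the S-band frequency corresponds to a wavelength of about 10 cm, and at 90 keV the gun electrons are already moving at about half the speed of light. The sketch below is purely illustrative and is not NSRRC design code; only the 2997.924 MHz frequency and the 90 keV gun energy come from the article.

```python
import math

C = 299_792_458.0          # speed of light, m/s
M_E_KEV = 511.0            # electron rest energy, keV (rounded)

f_rf = 2997.924e6          # S-band RF frequency from the article, Hz
wavelength = C / f_rf      # ~0.1 m, i.e. ~10 cm, typical of S band

# Relativistic speed of a 90 keV electron leaving the gun
gamma = 1.0 + 90.0 / M_E_KEV
beta = math.sqrt(1.0 - 1.0 / gamma**2)

print(f"RF wavelength: {wavelength * 100:.2f} cm")
print(f"90 keV electron: beta = {beta:.3f} (gamma = {gamma:.3f})")
```

The result, β ≈ 0.53, is why a dedicated bunching section is needed: the electrons must be accelerated further before they are fast enough to stay in phase with the travelling RF wave in the accelerating structures.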
The linac is the first TPS subsystem to require many supporting subsystems in order to proceed with its testing. During its development, various unexpected situations were encountered and steps were taken to deal with them; these included the need to make the test site available ahead of the completion of the building for the TPS storage ring. The establishment of a full-scale linac test site placed an additional workload on the TPS construction teams.
The construction of the linac test site and the process of obtaining operating permission from the Atomic Energy Council occupied the last four months of 2010. In January 2011, the TPS subsystem teams moved in and began working around the clock to set up the facility. Later they joined forces with the engineering teams from RI Research Instruments GmbH to install and test the linac. The effective collaboration between the different teams was a key factor in the smooth and successful commissioning of the linac. Having the TPS linac system up and running opens a number of channels for engineers to test their systems, especially those developed in-house by NSRRC staff, including the control and instrumentation group.
The design of the linac system by the four-member linac team began in the spring of 2006 and by June 2008 it was made available for an open bid to vendors worldwide. In November 2008, the project was awarded to ACCEL Instruments GmbH (acquired by Bruker Energy & Supercon Technologies Inc in 2009). This gave RI Research Instruments GmbH, a spin-off division from ACCEL, 27 months to prepare the complete system for commissioning with beam. After the first parts arrived at the linac test site in January 2011, the TPS linac team geared up to prepare for the acceptance test. The intense schedule put the linac staff on 24-hour shifts, in particular for the high-power RF conditioning of the accelerating structures. This effort reached a conclusion on 5 May, with an output energy greater than 150 MeV and measurement of the major beam parameters.
The TPS will be equipped with a 3 GeV electron accelerator, 518.4 m in circumference, and a low-emittance synchrotron storage ring. As part of the major advanced scientific research projects under the National Science Council, the TPS is being built next to the NSRRC’s first light source, Taiwan Light Source (CERN Courier June 2010 p16). The first official proposal of the TPS was submitted to governmental authorities in 2006 and the official ground breaking of TPS civil construction began on 7 February 2010.
16 May 2011, 08:56 EDT: The Endeavour space shuttle takes off for its final trip into space. In its payload bay it is carrying the Alpha Magnetic Spectrometer (AMS-02), a large particle detector. For the 600 physicists and engineers who have been working on the AMS project for more than 15 years, it is a fantastic accomplishment, but the real excitement is only just beginning. Two days after take-off, Endeavour will reach the International Space Station (ISS), where AMS-02 will be installed. From this vantage point in space, the advanced detector will catch the tracks of cosmic rays for years to come.
The Big Bang that gave rise to the universe about 13.7 billion years ago should have produced particles and antiparticles in equal amounts, as indeed happens in experiments at particle accelerators. So why is there no evidence of antimatter in the universe? This is one of the big questions of modern physics – one that the international AMS collaboration, led by Nobel laureate Sam Ting, is seeking to address. The mission will also join the search for dark matter in the universe and gather information on sources of cosmic rays.
Cosmic rays are energetic particles from outer space. Protons and the nuclei of helium together represent about 99% of the spectrum, while the remaining 1% is composed of heavier nuclei and electrons. The details of their origins remain unknown but they are probably produced in different cosmic objects, especially the most violent, such as supernova remnants. The sources must be powerful natural accelerators – more powerful than any achievable in laboratories on Earth.
AMS-02 is the first large magnetic spectrometer to fly into space for a long period of time, following on from its prototype, AMS-01, which flew aboard the space shuttle Discovery for 10 days in June 1998. It will allow the cosmic-ray spectrum to be measured with high precision. Such precise measurements are possible only from space, because the cosmic rays that bombard the Earth interact with the atmosphere. At the Earth’s surface it is possible to detect only the cosmic-ray showers that they produce, as the Pierre Auger Observatory in Argentina does, for example. The magnetic fields in space mean that it is impossible to deduce the location of any source of the primary charged particles by measuring their direction. However, determining the composition of the cosmic rays precisely is a key to knowing where they come from. All of the natural elements are present in cosmic rays in approximately the same ratio as in the Solar System, so detailed differences could reveal crucial information about the nature of their source – rather like fingerprints – and make it possible to identify the sources indirectly.
The requirements for operating in space are challenging and AMS-02 has been designed specially for this hostile environment. It has to resist vibrations and large temperature variations; it must operate in the vacuum of space and in the absence of gravity. In addition, the electronics have to be resistant to radiation. Other vital constraints also had to be considered in designing the detector, such as its weight and electrical consumption, both of which were strictly limited.
At the heart of AMS-02 is a powerful permanent magnet that bends charged particles and antiparticles in opposite directions. This “magic ring” is designed specifically to ensure that the magnet has a negligible net dipole moment, so avoiding a coupling with the Earth’s magnetic field that would otherwise disturb the orbit of the ISS. Around this magnet, several layers of detectors work together to identify the particles passing through (figure 1, p19). The silicon tracker measures the trajectory deflection of charged particles, the ring-imaging Cherenkov (RICH) detector estimates their velocity and the electromagnetic calorimeter (ECAL) measures their energy. In addition, the transition radiation detector (TRD) identifies light particles by detecting the X-rays that they emit. The time-of-flight (TOF) system acts as a trigger, alerting the subdetectors to an incident cosmic ray. An anti-coincidence counter was also developed to sort the events in real time, rejecting cosmic rays that traverse the magnet walls and keeping those that cross the whole detector. As a whole, AMS-02 is able to digitize 300,000 channels of data some 2000 times a second.
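The principle behind the magnet and tracker can be sketched in a few lines: a magnetic spectrometer infers a particle’s momentum from the radius of curvature of its track, via the standard relation p [GeV/c] ≈ 0.3 |z| B [T] r [m], while the bending direction gives the sign of the charge. The numbers in this sketch (the ~0.15 T field and the 10 m track radius) are illustrative assumptions, not AMS specifications.

```python
# Illustrative sketch, not AMS code: momentum from track curvature.
# A proton and an antiproton of equal momentum bend in opposite
# directions, so the signed curvature separates matter from antimatter.

def momentum_gev(b_tesla: float, radius_m: float, charge: int = 1) -> float:
    """p [GeV/c] ~= 0.3 * |z| * B [T] * r [m] for a singly charged particle."""
    return 0.3 * abs(charge) * b_tesla * radius_m

B = 0.15   # tesla -- assumed order of magnitude for a permanent magnet
r = 10.0   # metres -- hypothetical radius of curvature of a track

p = momentum_gev(B, r)
print(f"p = {p:.2f} GeV/c")
```

This is also why the precision of the silicon tracker matters so much: at high momentum the track is nearly straight, and the tiny residual curvature is all that distinguishes a nucleus from an antinucleus.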
AMS-02 can recognize one antiparticle among a billion particles. This represents an increase in sensitivity of three orders of magnitude in comparison with previous experiments. With such precision, the detector will provide the composition of the cosmic-ray spectrum with unprecedented accuracy. This will enable the AMS collaboration to find either an explanation for the disappearance of antimatter or proof of its existence hidden away in a remote corner of the universe. The observation of just one antihelium nucleus would provide evidence for the existence of a large amount of antimatter somewhere in the universe – large enough for antiprotons to have undergone a process of nucleosynthesis. This matter could only have been generated soon after the Big Bang and would represent a real breakthrough in the current view of the universe.
Antiworlds and the dark universe
There are other puzzles about the universe that AMS can try to solve. Only 4% of the universe is accessible to telescopes and detectors through the radiation that it emits. The rest appears to be in the forms known as dark matter and dark energy, which account for about 23% and 73%, respectively, of the total matter and energy in the universe. In this case, AMS-02 is expected to play a key role by tracking possible signals from the annihilation of supersymmetric particles. One of the candidates for a dark-matter particle is the neutralino, a hypothetical particle that is predicted by supersymmetry. If neutralinos do indeed exist they could interact with each other, producing excesses of charged or neutral particles – creating anomalies in the overall cosmic-ray spectrum.
For this reason, AMS-02 has been eagerly awaited because it will probably be the only experiment able to confirm or invalidate results from other experiments that have recently been in the spotlight. In particular, two satellites, PAMELA and FERMI, as well as the ATIC balloon experiment flown above Antarctica, have all reported an excess of electrons and positrons in the cosmic-ray spectrum. Even though these different measurements are inconsistent with each other, they could all possibly fit with the scenario of dark-matter annihilation. Using a completely different approach, underground experiments such as Xenon, CDMS or Edelweiss, are currently providing strong competition in the detection of dark-matter particles. Moreover, in parallel the LHC is producing data that could in the next two years provide the first exciting news on dark matter.
Particle physics, astroparticle physics and cosmology are certainly at a key moment in their history as AMS-02 enters the race. “Better to light a candle than to curse the darkness,” goes a Chinese proverb. All eyes are now focused on the AMS-02 experiment, hoping that it will precisely light the candle on both the dark universe and the antiuniverse.
• AMS was built by an international collaboration involving large European participation from France, Germany, Italy, the Netherlands, Spain and Switzerland, together with China, Taiwan, Russia and the US. It has been supported by the national high-energy institutes INFN, IN2P3 and CIEMAT, the US Department of Energy, Academia Sinica (Taipei) and the Swiss National Fund, and by the space agencies ASI, DLR, NASA and ESA. The detectors were integrated at CERN by the collaborating groups. Space qualification took place at ESA’s ESTEC facilities.
“Most of our understanding of our cosmos up to now comes from measuring light. Besides light rays, there are charged particles, which have not been used nearly as much as light to understand the universe.” Samuel Ting, the principal investigator for the Alpha Magnetic Spectrometer (AMS-02) experiment, is talking to the assembled press at the 2.00 p.m. briefing in the crowded auditorium at the Kennedy Space Center, Cape Canaveral. His experiment is soon to fly on board the space shuttle Endeavour, prior to installation on the International Space Station (ISS). “AMS is the first detector to study charged particles from cosmic rays directly in space, thanks to its magnet – the first magnet in space – and it will do so for the next 20 years, for the space station’s entire lifetime,” he explains. “It is the only fundamental science experiment in the space station.”
This was the main argument that Ting, well known as a leading particle physicist and Nobel laureate, used to convince the US Congress in 2008 to request that NASA “shall take all necessary steps to fly one additional space shuttle flight to deliver the Alpha Magnetic Spectrometer and other scientific equipment and payloads to the International Space Station prior to the retirement of the space shuttle” (US government 2008). AMS-02 had been grounded in 2005, in response both to the accident of the space shuttle Columbia on re-entry in 2003 and to the decision to retire the shuttle by late 2010. Were it not for its primary payload, the 2011 launch of Endeavour might never have been scheduled.
Unexpected discoveries
The principal scientific goals of AMS-02 are to search for dark matter and antimatter (AMS: the search for exotic matter goes into space). However, AMS’s biggest discovery might come in a totally unexpected area. Ting points out that the major discoveries in particle physics over the past 50 years were made in areas of physics that had not been anticipated when building the facilities where the discoveries were made. Take, for example, neutral currents, which Ting regards as the first major discovery made at CERN’s Proton Synchrotron. “When the Proton Synchrotron was built nobody thought about neutral currents,” he says. Likewise for the Brookhaven National Laboratory, which built the Alternating Gradient Synchrotron (AGS) during the same period. “The purpose was to study the nuclear force; instead it discovered a second kind of neutrino, CP violation and the J particle,” he explains. “So what you predict and what you discover are often very different things. With AMS, we are going to explore new territory with a precision instrument, and that is the key to discovery. What we will really see, nobody knows. This is how science advances.” This philosophy was also a key to his Nobel-prize-winning experiment at the AGS, which opened up a new world of particle physics based on a fourth kind of quark, charm, in what has become known as the “November revolution” of 1974.
The story of the AMS experiment goes back nearly 20 years, to the cancellation of the Superconducting Supercollider project in 1993. It was then that Ting first had the idea to send a relatively large-scale particle detector into space. “I began to think maybe I should do something different, not necessarily with accelerators, and then I remembered that in early 1964 I did an experiment together with Professor Leon Lederman to show how an antiproton and an antineutron form an antideuteron. A similar experiment was also done by Professor Zichichi’s group at CERN. So I began to think, maybe I should do an experiment in space. In the 1990s, together with a group of colleagues, we saw the ISS as an opportunity to mount an experiment to study cosmic rays. With support from NASA and the US Department of Energy, an international consortium started work on AMS, and we flew a precursor instrument on the STS-91 shuttle mission in 1998.”
The key component of AMS-01 was the magnet, as in its successor, but the detector was much simpler. “It was intended as a proof of principle, proof that a magnet could go to space and it did so by flying 10 days on the space shuttle Discovery,” explains Ting. The test flight in June 1998 not only showed that everything worked, but also made some initial intriguing measurements of cosmic rays in space. This provided the ground work for AMS-02, which is intended to operate for 20 years. “It has greater, more precise subdetectors, with many channels, whose size, scope and precision are totally different from AMS-01,” he says. “We made them as precise as we could manage.”
So how does Ting feel to see AMS-02 finally being about to launch after the long journey that began with AMS-01? “I am actually very calm; I am confident everything will be OK. This detector spent two decades in the workshop: at CERN, we tested the detector twice with a beam from the SPS accelerator, then we tested it in the thermovacuum chamber at ESA-ESTEC. We took it apart and re-assembled it three times, so we’re quite familiar with what’s going on inside. All of the subdetectors measure energy in a repetitive way, so I think everything will work.”
It is not surprising that CERN’s facilities were used in testing AMS-02. Ting’s relationship with the laboratory goes back nearly half a century, his first day at CERN being on 13 March 1963, as a Ford Foundation Fellow. “There, I had the good fortune to work with Giuseppe Cocconi at the Proton Synchrotron, and I learnt a lot of physics from him,” he recalls. Particle physics and CERN have certainly both evolved a great deal since then. “When I first came to CERN, high-energy physics was dominated by the US,” he says. “Most people at CERN were looking at what was done at Brookhaven and tried to do similar experiments. Now the picture has completely changed. Most US particle physicists come to CERN, and CERN now really has become the centre of high-energy physics in the world.”
With all of the current interest in CERN and particle physics, Ting has some serious, practical advice for young people aspiring to become physicists. “If you want to be a scientist, whether it is a physicist, mathematician or biologist, you need to remember that you’re doing this only for interest, not for fame or glory, because only very few people in their lifetime accomplish what they really want,” he explains. “Physics is a very difficult thing; particle physics involves large groups of people working together. Unless you think that physics is the most important thing in your life, you should not do it. It takes passion, precision, patience.”
Patience is a quality that Ting certainly has, waiting for this “last Endeavour” for his AMS project and never giving up hope. So when does he expect the first important results? “We have no competition,” he says. “We are going to do this very slowly, very carefully. We won’t publish any preliminary results; we’ll only publish the data that we’re absolutely sure about.” Whatever AMS discovers, the final answers, like much that Ting has achieved, will be the result of passion, precision and patience.
Thursday 28 April, LD–1. It’s launch-minus-one day at the Kennedy Space Center (KSC) in Cape Canaveral and so far it’s “go” for tomorrow’s 10-minute launch window at 3.47 a.m. EDT, the time set for space shuttle Endeavour’s final lift-off. I am one of 1500 members of the international press accredited at the KSC and one of an expected half a million viewers there to witness the launch.
Much of the attention surrounding this mission has focused on the fact that this will be the final flight of Endeavour and the penultimate mission of the entire space-shuttle programme, and that the mission commander, Mark Kelly, is married to Congresswoman Gabrielle Giffords, who is recovering from a shooting more than four months earlier. And, according to the latest rumours among the press at KSC, the “first family” is expected to attend tomorrow.
It’s T–11 hours and holding, one of the longest pauses (around 14 hours) built into the countdown procedure. I join the media registered to witness the removal of Endeavour’s Rotating Service Structure (RSS), one of the important milestones performed during the T–11 hold. Around midnight, the metallic gantry around the shuttle starts to move away under the enthralled gaze of the press representatives who were brave enough to stay, revealing Endeavour in all of its splendour. Once the operation has been performed, there are still 11 hours and many unknowns before lift-off. Moreover, the weather does not seem promising, with lightning threatening NASA’s Vehicle Assembly Building and launch pad 39.
Friday 29 April, T–3 hours and… scrub. After a short night (the RSS removal took place after midnight), we wake up early so as not to miss another milestone in the countdown schedule – the astronauts’ “walk-out” and departure for the launch pad. We have to be early at the media centre for the usual “K-9” controls (dogs checking for explosives). On the way, I stop at AMS’s premises at KSC, which happen to be close to the Operations and Checkout building, where astronauts have spent the night before launch since the time of the Apollo missions. We see three of them jogging – their last chance for a while.
Walk-out takes place at 11.58 a.m. as planned. I barely manage to shout “Forza Roberto” to my compatriot Roberto Vittori, before my voice is drowned in the crowd of media and NASA staff cheering the STS-134 crew, as they proceed to the Airstream van (also used by all crews since Apollo times).
However, in the media bus taking us back to the press centre, we see the Airstream van backing up – a clear sign that something has gone wrong. At a press briefing we learn that, while the astronauts were on their way to the pad, the launch team identified a fault in the heaters of the auxiliary power unit that prevents the shuttle’s fuel from freezing. This is enough to scrub the launch window.
Sunday 15 May, T–11 hours. I’m back at KSC for the second launch attempt and the legendary countdown clock is again on T–11 hours and holding. The faulty box in the shuttle’s aft compartment that resulted in the launch postponement has been replaced and the entire system re-tested. The weather forecast for tomorrow’s slot is “70% go”. Countdown will resume soon.
Monday 16 May, T–9 minutes and counting. The Mission Management team has just given the final “go” for launch. In less than 9 minutes Endeavour will lift off with AMS cradled in its cargo bay. I am on the media-centre lawn, less than 5 km from the launch pad, one of the closest points to watch a launch at KSC. I’m grateful to Prof. Ting for the invitation. No words can convey the emotion; it’s a lifetime experience not to be forgotten.
Just before midnight on 21 April, the LHC set a new world record for beam intensity at a hadron collider when its beams collided with a peak luminosity of 4.67 × 10³² cm⁻²s⁻¹. This exceeds the previous world record of 4.024 × 10³² cm⁻²s⁻¹, which was set by Fermilab’s Tevatron collider in 2010, and marks an important milestone in LHC commissioning. The new record, made with 480 bunches per beam, lasted only a couple of days before collisions with 768 bunches per beam delivered around 8.4 × 10³² cm⁻²s⁻¹. By the time a period of machine development began in the first week of May, the integrated luminosity for ATLAS and CMS for 2011 had reached more than 250 pb⁻¹.
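A rough way to read these numbers: at fixed per-bunch parameters, peak luminosity scales linearly with the number of colliding bunches. The sketch below extrapolates the 480-bunch record to 768 bunches under that naive assumption; it is illustrative arithmetic only, since real machine performance also depends on bunch intensity and emittance, which were improving at the same time.

```python
# Naive linear scaling of peak luminosity with bunch count
# (assumes per-bunch parameters are unchanged -- an idealization).

l_480 = 4.67e32        # cm^-2 s^-1 with 480 bunches per beam (the record)
bunches_480 = 480
bunches_768 = 768

l_768_naive = l_480 * bunches_768 / bunches_480
print(f"linear extrapolation: {l_768_naive:.2e} cm^-2 s^-1")
```

The extrapolation gives about 7.5 × 10³² cm⁻²s⁻¹, noticeably below the ~8.4 × 10³² cm⁻²s⁻¹ actually delivered, which indicates that the per-bunch performance from the injectors had also improved between the two fills.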
Early in April, a period of “scrubbing” took place to improve the surface characteristics of the beam pipe. This run saw more than 1000 high-intensity bunches per beam circulating at 450 GeV with 50 ns spacing. Given the potential luminosity performance (more bunches, higher bunch intensity from the injectors), the decision was taken to continue the 2011 physics run with this bunch spacing.
For 50 ns injection into the LHC, the Super Proton Synchrotron (SPS) takes batches of 36 bunches from the Proton Synchrotron. Since the scrubbing run, the LHC has passed through 228, 336, 480 and 624 bunches per beam to reach the latest total of 768. Each step-up of 144 bunches represents two extra injections of 72 bunches (2 × 36) from the SPS. This is a considerable amount of beam power and the injection process needs to be carefully tuned and monitored. A few days are spent delivering physics after each step-up to check the performance of the machine and make sure that no intensity-dependent effects are compromising machine protection.
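The filling-scheme arithmetic above can be laid out explicitly: the PS delivers batches of 36 bunches, each SPS-to-LHC injection carries two such batches (72 bunches), and each step-up of 144 bunches is therefore two extra injections. The sketch checks this for the later steps quoted in the article (480 → 624 → 768).

```python
# Bunch-ladder arithmetic for the 50 ns filling scheme described above.

PS_BATCH = 36                     # bunches per PS batch
SPS_INJECTION = 2 * PS_BATCH      # 72 bunches per SPS-to-LHC injection
STEP = 2 * SPS_INJECTION          # 144 bunches per step-up

totals = [480, 624, 768]          # bunch totals quoted in the article
for lower, upper in zip(totals, totals[1:]):
    extra = (upper - lower) // SPS_INJECTION
    print(f"{lower} -> {upper}: +{extra} injections of {SPS_INJECTION}")
```

Each printed step corresponds to two more injections of 72 bunches, matching the "step-up of 144 bunches" described in the text.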
The push-up in the number of bunches will continue towards a potential maximum for the year of around 1400.
The space shuttle Endeavour launched successfully on its 25th and final spaceflight on 16 May at 08.56 a.m. local time. It carried the Alpha Magnetic Spectrometer (AMS-02), designed to operate as an external module on the International Space Station (ISS). Endeavour was scheduled to dock with the ISS nearly 48 hours later, on 18 May, for a 16-day mission. A problem with an auxiliary power unit had led to the last-minute postponement of the earlier planned launch on 29 April.
AMS will study the universe and its origin by searching for antimatter and dark matter while performing precision measurements of the composition and flux of cosmic rays. There will be more about the mission in the next issue of CERN Courier.
“Designing in an open environment is definitely more fun than doing it in isolation, and we firmly believe that having fun results in better hardware.” It is hard to deny that enthusiasm is inspiring and that it can be one of the factors in the success of any enterprise. The statement comes from the Manifesto of the Open Hardware Repository (OHR), which is defined by its creators as a place on the web where electronics designers can collaborate on open-hardware designs, much in the philosophy of the movement for open-source software. Of course, there is more to this than the importance of enthusiasm. Feedback from peers, design reuse and better collaboration with industry are also among the important advantages to working in an open environment.
The OHR was the initiative of electronics designers working in experimental-physics laboratories who felt the need to enable knowledge-exchange across a wide community and in line with the ideals of “open science” being fostered by organizations such as CERN. “For us, the drive towards open hardware was largely motivated by well-meaning envy of our colleagues who develop Linux device-drivers,” says Javier Serrano, an engineer at CERN’s Beams Department and the founder of the OHR. “They are part of a very large community of competent designers who share their knowledge and time in order to come up with the best possible operating system. They learn a lot and have lots of fun in the process. This enables them to provide better drivers faster to our CERN clients,” he continues. “We wanted that, and found out that there was no intrinsic reason why hardware development should be any different. After all, we all work with computers and the products of our efforts are also binary files, which later become pieces of hardware.”
One of the main factors leading to the creation of the OHR was the wish to avoid duplication by simply sharing results across different teams that might be working simultaneously towards the solution of the same problem. Sharing the achievements of each researcher in the repository also results in an improved quality of work. “Sharing design effort with other people has forced us to be better in a number of areas,” states Serrano. “You can’t share without a proper preliminary specification-phase and good documentation. You also can’t share if you design a monolithic solution rather than a modular one from which you and others can pick bits and pieces to use in other projects. The first time somebody comes and takes a critical look at your project it feels a bit awkward, but then you realize how much great talent there is out there and how these people can help, especially in areas that are not your main domain of competence.”
Two years after its creation, the OHR currently hosts more than 40 projects from institutes that include CERN, GSI and the University of Cape Town. Such a wealth of knowledge in electronics design can now be shared under the newly published CERN Open Hardware Licence (OHL), which was released in March and is available on the OHR. “In the spirit of knowledge sharing and dissemination, this licence governs the use, copying, modification and distribution of hardware design documentation, and the manufacture and distribution of products,” explains Myriam Ayass, legal adviser of the Knowledge and Technology Transfer Group at CERN and author of the CERN OHL. The documentation that the OHL refers to includes schematic diagrams, designs, circuit or circuit-board layouts, mechanical drawings, flow charts and descriptive texts, as well as other explanatory material. The documentation can be in any medium, including – but not limited to – computer files and representations on paper, film, or other media.
The introduction of the CERN OHL is indeed a novelty, in which the long-standing practice of sharing hardware designs has acquired a clear policy for the management of intellectual property. “The CERN OHL is to hardware what the General Public Licence is to software. It defines the conditions under which a licensee will be able to use or modify the licensed material,” explains Ayass. “The concept of ‘open-source hardware’ or ‘open hardware’ is not yet as well known or widespread as the free software or open-source software concept,” she continues. “However, it shares the same principles: anyone should be able to see the source (the design documentation in the case of hardware), study it, modify it and share it. In addition, if modifications are made and distributed, it must be under the same licence conditions – this is the ‘persistent’ nature of the licence, which ensures that the whole community will continue benefiting from improvements, in the sense that everyone will in turn be able to make modifications to these improvements.”
Despite these similarities, the application of “openness” in the two domains – software and hardware – differs substantially because of the nature of the “products”. “In the case of hardware, physical resources must be committed for the creation of physical devices,” Ayass points out. “The CERN OHL thus specifically states that manufacturers of such products should not imply any kind of endorsement or responsibility on the part of the designer(s) when producing and/or selling hardware based on the design documents. This is important in terms of legal risks associated with engaging in open-source hardware, and properly regulating this is a prerequisite for many of those involved.”
The OHR also aims to promote a new business model in which companies can play a variety of roles, design open hardware in collaboration with other designers or clients and get paid for that work. As Serrano explains: “Companies can also commercialize the resulting designs, either on their own or as part of larger systems. Customers, on their side, can debug designs and improve them very efficiently, ultimately benefiting not only their own systems but also the companies and other clients.”
“The fact that the designs are ‘open’ also means that anyone can manufacture the product based on this design – from individuals to research institutes to big companies – and commercialize it. This is one approach of technology transfer that nicely combines dissemination of the technology and of the accompanying knowledge,” adds Ayass. This combining of an innovative business model and the OHL is finding a positive response in the commercial world. “We are very excited because we are proving that there is no contradiction between commercial hardware and openness,” says Serrano, who concludes: “The CERN OHL will be a great tool for us to collaborate with other institutes and companies.”
At the end of March, an electron beam was steered round the ring of a new type of particle accelerator and successfully accelerated to 18 MeV for the first time. EMMA (Electron Model for Many Applications) is a proof-of-principle prototype built at the UK Science and Technology Facilities Council’s Daresbury Laboratory to test the concept of the non-scaling fixed-field alternating gradient accelerator (FFAG). The technique should allow the construction of a new generation of more powerful, yet more compact and economical accelerators.
The successful acceleration – a “world first” – confirms not only that the design of the most technically demanding aspects of EMMA is sound but also demonstrates the feasibility of the technology used. The next steps will be to move towards full acceleration, from 10 to 20 MeV, and to commence the detailed characterization of the accelerator.
The basic concept underlying EMMA is that of the FFAG, in which a ring of fixed-field magnets simultaneously steers and focuses the electron beam round the machine. The focusing is as strong as in an alternating-gradient synchrotron but the beam spirals outwards while it is accelerated, as in a cyclotron. However, with sufficiently strong magnetic focusing the displacement of the beam as it accelerates and spirals can be kept much smaller than in other types of accelerator. This makes the FFAG concept attractive for a range of applications, from treating cancer to powering safer nuclear reactors that produce less hazardous waste.
The design of EMMA’s magnet ring presented several challenges. The focusing magnets have a standard quadrupole geometry but they are used to steer the beam by offsetting it horizontally. The magnets are short, so “end effects” become important, and pairs of magnets are closely spaced around the ring, so the interaction between magnets is non-trivial.
• EMMA is a major part of the CONFORM project of the British Accelerator Science and Radiation Oncology Consortium (BASROC) and is funded by the Research Councils UK (RCUK) Basic Technology programme.
The winter technical stop saw the final steps of the installation of the TOTEM experiment at the LHC. After 8 years of development, the two arms of the inelastic telescope T1 were successfully installed inside the CMS endcap at about 10.5 m on either side of the interaction point. This detector joins the previously installed telescope T2 (at 13.5 m), as well as detectors in two sets of Roman Pots at 147 m and 220 m. Additional detectors at 147 m were also installed in the shutdown.
TOTEM is designed to make precise measurements of the total proton–proton cross-section and to perform detailed studies of elastic and diffractive proton–proton scattering. It requires dedicated runs of the LHC at low luminosities to allow the movable Roman Pots to bring detectors as close to the beam as possible.
A month after restarting in February, the LHC was once again breaking records. Following a period of commissioning, the first run with stable beams for physics at 7 TeV in the centre-of-mass began on 13 March, with a modest three bunches per beam and a luminosity of 1.6 × 10³⁰ cm⁻² s⁻¹. Then, after further machine-protection tests, the way was opened for increasing numbers of bunches to be introduced in “fills” for physics, culminating with 200 bunches per beam on the evening of 22 March. This gave a peak luminosity at ATLAS and CMS of 2.5 × 10³⁰ cm⁻² s⁻¹, comfortably beating last year’s record made with 368 bunches. By 25 March, the LHC had delivered an integrated luminosity of 28 pb⁻¹, more than half of the total delivered in 2010.
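As a rough back-of-envelope illustration, a peak luminosity can be turned into an approximate collision rate by multiplying it by the inelastic proton–proton cross-section. The ~70 mb value used below is a typical order-of-magnitude figure assumed for illustration, not a number from this report:

```python
# Sketch: converting the quoted peak luminosity into an approximate
# inelastic pp interaction rate. The cross-section is an assumed,
# illustrative value, not a figure from the article.
PEAK_LUMINOSITY = 2.5e30    # cm^-2 s^-1, peak quoted for ATLAS and CMS
SIGMA_INELASTIC = 70e-27    # cm^2 (~70 mb, rough pp inelastic cross-section)

# Event rate = luminosity x cross-section
rate = PEAK_LUMINOSITY * SIGMA_INELASTIC
print(f"~{rate:.2e} inelastic pp interactions per second")
```

At these parameters the machine would produce interactions at a rate of order 10⁵ per second in each experiment.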
The next challenge was to run not only with more bunches but also with a closer spacing between them; 2010 saw running with 368 bunches at 150 ns spacing, while the run with 200 bunches this March used 75 ns spacing. However, combining small bunch spacing with a high number of bunches leads to an effect known as “electron cloud”: synchrotron radiation from the protons releases electrons at the beam screen, which are pulled towards the protons and knock out more electrons on hitting the opposite wall.
After a brief technical stop for maintenance, the operating team began a period of “scrubbing runs”, in which a high beam current is injected at low energy to induce electron clouds under controlled conditions. The aim is to release gas molecules trapped inside the metal, to be pumped out later, and to decrease the yield of electrons at the surface. These runs had already paid off by 10 April, when the number of bunches per beam reached 1020, with a total of 10¹⁴ protons per beam – another record for the LHC.
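The quoted beam totals directly imply the average bunch intensity, as this minimal sketch shows (the variable names are illustrative):

```python
# Sketch: average protons per bunch implied by the quoted beam totals.
TOTAL_PROTONS = 1e14   # protons per beam, as quoted for 10 April
N_BUNCHES = 1020       # bunches per beam, as quoted

protons_per_bunch = TOTAL_PROTONS / N_BUNCHES
print(f"~{protons_per_bunch:.2e} protons per bunch")
```

This works out at roughly 10¹¹ protons per bunch.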
Research teams think that there is little damage, if any, to the two large particle-physics experiments in the Soudan mine in Minnesota, following a fire in the access shaft on 17 March, which shut down both the mine and the experiments located 800 m underground.
When the fire was detected at around 9 p.m., the fire-protection system shut down the power to the Soudan Underground Laboratory, as designed. No personnel were in the mine at the time. The cause of the fire is believed to be linked to shaft-maintenance work earlier in the day.
Fire fighters extinguished the fire by pumping some 265,000 litres of water and fire-extinguishing foam down the access shaft. Some of the foam entered the caverns of the underground laboratory, which is managed by the University of Minnesota. The laboratory houses the 5000-tonne far detector of the Main Injector Neutrino Oscillation Search (MINOS) experiment, the Cryogenic Dark Matter Search (CDMS) experiment, managed by Fermilab, and several other smaller experiments.
Ten days after the fire, the first crew of scientists returned to the laboratory as electricians began restoring power. Although residue of fire-fighting foam was found across large parts of the laboratory, researchers from CDMS found no apparent damage to their experiment. During the 10-day power outage, the experiment, which operates ultra-sensitive particle detectors at a temperature of about 50 mK, warmed to room temperature without losing vacuum. When the team turned the power back on, all cryogenic systems functioned as normal.
No water or foam was found on the electronics for MINOS. The experiment’s large electromagnetic coil was partially immersed in water and will be carefully dried out before being used once more. The coil provides the neutrino detector with a magnetic field for charged-particle identification.
There are several smaller experiments in the mine, including the CoGeNT dark-matter search. An assessment of these experiments will be made when full access to the underground laboratory is available.
Complete clean-up, final assessment and restart of the experiments will occur once a new power cable has been installed in the shaft, allowing full power to be restored to the laboratory.