
CANDLE lights up research in Armenia

In late 2001 a new non-profit foundation, CANDLE (Center for the Advancement of Natural Discoveries using Light Emission), was established in Armenia. Its aim is the construction of a 3 GeV synchrotron light facility in the Armenian capital, Yerevan. The project is well supported by the Armenian government, which has provided the office building and 20 hectares of land for the creation of the new laboratory. In a letter to the president of the CANDLE foundation, the minister of foreign affairs of the Republic of Armenia, Vardan Oskanian, said: “We believe the project is worthy of support by the private sector, the international community, the US government and, of course, the Armenian government.”

Early in 2002 the new institute received a $500,000 (~€419,000) grant from the US State Department for the design study of the facility. This was performed under a US Department of Energy contract with the director of CANDLE, Alex Abashian from Virginia Tech in the US, as principal investigator, and the work was undertaken by a team of Armenian scientists and engineers in close collaboration with colleagues from other countries. The design report was completed only six months later, in June 2002. Meanwhile, 69 user proposals from the international scientific community were submitted to a review panel established by the US State Department with the help of the National Science Foundation. The review panel, headed by Maury Tigner from Cornell University, held a special two-day meeting in Washington in August 2002 to evaluate the scientific, technical and organizational viability, as well as international aspects, of the proposed facility.

World class

The review panel reported that the CANDLE project would be a “world-class facility capable of enabling frontier work across the full range of physical, life and engineering sciences” and that it was highly likely that a reasonable user community could be developed. The report also described CANDLE as “an excellent investment from a scientific/technical point of view…providing a great opportunity to be the principal third-generation synchrotron facility, not only in Armenia but also the entire region.” An important recommendation of the review panel relates to the funding of the new facility: “The committee urges the state department to consider an approach in which a construction funds commitment is contingent on the project delivering a plan for operations funding with hard commitments.”

Several leading European synchrotron laboratories – ANKA, BESSY, DESY, ELETTRA and ESRF – along with the co-ordinator of the European Round Table on Synchrotron Radiation and Free-Electron Laser, Giorgio Margaritondo, have expressed their support for the creation of the new facility in Armenia. DESY has expressed its willingness to make an in-kind contribution to the project by providing the components of the S-Band linear accelerator for the injector system. The radiofrequency components will be transported to Armenia during 2004. The Scientific Committee of the University of Provence in Marseille has also expressed its support and interest. An official delegation from the university, headed by Jean-Marc Layet, visited the new laboratory in Armenia in April 2003, when a memorandum of understanding on the co-operative programme to be conducted on the future facility was signed.

The support voiced by the international community indicates the high regard in which the Armenian project is held, and suggests that the user case for the facility will be robust by the time it is completed. The facility will be unique within a 2000 km radius and will draw on the huge intellectual potential of the region. In turn, the initiative of the Armenian scientists provides a strong case for the promotion of synchrotron-light-based research worldwide.

Armenia’s leaders fully appreciate the value of the project for the country’s long-term development and its integration into the international scientific community. “The beneficial impact will extend well beyond the boundaries of research, producing a positive picture of a country oriented towards science and technology and is capable of mastering the most sophisticated and ambitious projects,” said Margaritondo in a letter to the Armenian minister of science and education. The Armenian National Academy of Science, Yerevan State University, the Georgian Academy of Science, Tbilisi University, and prominent scientists from Armenia and the surrounding region have all expressed their support for the CANDLE project. They consider the creation of the new light source as an “engine” for the promotion of scientific co-operation both in the region and globally.

Encouraging signs

On 26 February this year the European Parliament voted for a special amendment related to scientific co-operation with the countries of the South Caucasus, specifically stating that: “the commitment of the European Union to the Armenian synchrotron facility CANDLE will be an encouraging sign for this project, which primarily concerns European scientific research teams.” At the same time the US State Department has renewed its consideration of the CANDLE project. Also in February, an official US delegation, including Barry Barish from the California Institute of Technology, visited the CANDLE laboratory to inspect the project and review the steps necessary for its continuation.

More recently, a special session on “Synchrotron radiation research in developing countries and international scientific co-operation”, organized by Herman Winick of the Stanford Synchrotron Radiation Laboratory and sponsored by the American Physical Society, took place in Montreal, Canada, on 22 March. In the session the regional impact of the Brazilian Light Source in South America, SESAME in the Middle East and SIAM in Southeast Asia were presented, together with the CANDLE project, which was described in an aptly titled talk: “Rejuvenating science in Armenia and its neighbours”. Several US officials and experts attended the session, including Ray Orbach, director of the Office of Science at the Department of Energy.

Alex Abashian hopes that this latest effort will result in a green light for the CANDLE pre-construction stage during 2004, in line with the funding schedule and based on the report of the review panel. The pre-construction phase includes an extensive prototyping programme; test stands for radiofrequency, vacuum and magnet systems; site development; the establishment of machine and user advisory committees; and a number of international workshops on the facility and opportunities for users.

The vision behind the CANDLE project has been aptly summarized by the president of the foundation, Jirair Hovnanian, head of a New Jersey-based family-run building company: “It is our vision and desire that CANDLE will be an international facility that will provide opportunities for scientists in the region and beyond to have access to a user-friendly, world-class, third-generation light source. A natural by-product of CANDLE is the renewal of the scientific standards in Armenia to their past world-class level, and the provision of employment for Armenian, as well as neighbouring, scientists, both young and mature, thus reversing the brain drain from the region.” Some of these aims are already bearing fruit: as early as May 2003 the ArmElectroMash company in Armenia had successfully completed the first prototype dipole magnet for the CANDLE booster synchrotron. This positive experience has made local fabrication of the magnets and vacuum chamber for the new facility a reality, providing benefits to Armenian industry, even at the construction stage. The CANDLE leadership believes that the continuation of the project will provide a good base for the commitments by other funding sources, and will strengthen international co-operation in setting up and using the new facility.

WestGrid team announces completion of computing network in western Canada

Scientists leading the WestGrid project in Canada have announced that the major resources of this $48 million project are available for general use by the research community. Canadian particle physicists have already applied WestGrid successfully to ongoing experiments, and plans are underway at TRIUMF to link the WestGrid Linux cluster to the LHC Computing Grid (LCG).

The aim of the WestGrid project is to provide high-performance computing in western Canada, based on resources at several universities in Alberta and British Columbia, and at TRIUMF. It currently consists of the following: a 256-CPU shared-memory machine (SGI Origin) for large-scale parallel processing at the University of Alberta; a cluster of multiprocessors (36 × 4-CPU Alpha nodes) at the University of Calgary, connected by a high-speed Quadrics interconnect, also for parallel jobs; a 1008-CPU Linux cluster (3 GHz IBM blades) at the University of British Columbia (UBC) and TRIUMF for serial or loosely coupled parallel jobs; and a network storage facility (IBM) at Simon Fraser University, initially with 24 terabytes of disk space and about 70 terabytes of tape. As of November 2003, the WestGrid Linux cluster at UBC/TRIUMF ranked 58th in the “TOP500 Supercomputer Sites” rankings.

The Grid-enabled infrastructure also includes major collaborative facilities known as Access Grid nodes, with a total of seven institutions interconnected over dedicated research “lightpaths” on the existing provincial and national research networks. The new resources are expected to support advances in research in many disciplines where large amounts of data are typically involved, such as medical research, astronomy, subatomic physics, pharmaceutical research and chemistry.

Two particle-physics experiments, TWIST at TRIUMF and D0 at Fermilab, have already participated in the testing phase at the UBC/TRIUMF site. Both experiments benefited greatly from access to significant computing resources during the tests. For the future, it is planned to connect WestGrid indirectly to the LCG through the LCG site at TRIUMF. Work is ongoing to develop the software necessary to achieve this without the need to install LCG tools on WestGrid itself.

CERN’s giant fridge

When the Large Hadron Collider (LHC) begins operation at CERN it will be one of the coldest places on Earth. To keep the protons on course around the LHC and at the same time attain as high an energy as reasonably possible requires powerful superconducting magnets, which will operate at a temperature of 1.9 K – only 1.9 degrees above the absolute zero of temperature and chillier than outer space. While there may indeed be colder places in other laboratories, none will be on the scale of the LHC. The task of keeping the 27 km long structure at 1.9 K will be performed by helium, which will itself be cooled to its superfluid state in a huge refrigeration system.

The choice of the operating temperature for the LHC has as much to do with the “super” properties of helium as with those of the superconducting niobium-titanium alloy in the magnet coils. At atmospheric pressure helium gas liquefies at around 4.2 K, but when it is cooled further it undergoes a second phase change at about 2.17 K to its superfluid state. Among many remarkable properties, superfluid helium has a very high thermal conductivity, which makes it the coolant of choice for the refrigeration and stabilization of large superconducting systems. Indeed, the LHC cryogenic system will carry kilowatts of refrigeration over more than 3 km with a temperature difference between the two ends of only 0.1 K (from 1.8 to 1.9 K).

From the cryogenic point of view, the LHC is a large distributed helium system operating at a variety of temperature levels down to 1.8 K. The machine is being constructed in the tunnel built originally for the Large Electron Positron collider, LEP, and to keep costs down the cooling system has been designed around the four 4.5 K refrigeration plants that were used to cool the superconducting radiofrequency cavities for the second phase of the collider, LEP2. (The tunnel is not the only part of LEP to be “recycled” for the LHC!) The design of these refrigeration plants sets the temperature levels for the whole system at 75, 50, 20 and 4.5 K, in addition to the ultimate level of 1.8 K produced by the refrigeration system that provides the superfluid helium to the “cold mass” containing the superconducting coils.

The LHC will consist of eight 3.3 km long sectors, with access shafts to services on the surface only at the ends of each sector. The layout for the refrigeration system is therefore based on five “cryogenic islands” – three of which serve two sectors, while two serve a single sector each. Thus each “island” must distribute and recover coolant over a distance of 3.3 km.

Among the main components for this refrigeration system are eight 4.5 K refrigerators – one for each sector – each with a capacity of 18 kW at 4.5 K. Four of these have been recovered from LEP, and they will be upgraded to operate on the sectors with a slightly lower demand for refrigeration. The four high-load sectors will be cooled by new 4.5 K refrigerators, the last of which passed its acceptance procedures towards the end of 2003.

The refrigeration power needed to cool the 4700 tonnes of material of each sector of the LHC is enormous and can only be produced by using liquid nitrogen. Consequently, each 4.5 K refrigerator is equipped with a 600 kW liquid-nitrogen pre-cooler, which will be used to pre-cool a flow of helium down to 80 K while the corresponding sector is being cooled and later filled with helium – a procedure that will take just under two weeks. Using no liquid nitrogen but only helium in the tunnel considerably reduces the risk of oxygen deficiency in the case of accidental release.
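A rough back-of-envelope check makes these numbers plausible. The specific enthalpy change of steel between room temperature and 80 K (taken here as roughly 90 kJ/kg) is an assumed round figure, not from the source:

```python
# Estimate the heat that must be removed to cool one 4700-tonne LHC
# sector from ~300 K to 80 K, and the average cooling power implied
# by a two-week cool-down.
mass = 4.7e6          # kg of cold mass per sector (from the text)
dh = 90e3             # J/kg, ASSUMED enthalpy change of steel, 300 K -> 80 K

heat = mass * dh              # total heat to remove, ~4.2e11 J
two_weeks = 14 * 24 * 3600    # cool-down time in seconds

avg_power = heat / two_weeks  # average cooling power, ~350 kW
print(f"heat to remove: {heat/1e9:.0f} GJ, average power: {avg_power/1e3:.0f} kW")
```

The resulting average of roughly 350 kW sits comfortably within the 600 kW capacity of each liquid-nitrogen pre-cooler, consistent with the quoted two-week schedule.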

The 4.5 K refrigeration system works by first compressing the gas and then allowing it to expand. While it expands it cools by losing energy through mechanical turbo-expanders that run at up to 120,000 rpm on helium-gas bearings. Only two companies in the world provide turbo-expanders with sufficient cooling power – Air Liquide in France and Linde in Switzerland. Each of the refrigerators consists of a helium compressor station equipped with oil and water removal systems and a vacuum-insulated cold box (60 tonnes) where the process fluid is cooled, purified and liquefied. The compressor station supplies compressed helium gas at 20 bar and room temperature. The cold box houses the heat exchangers and turbo-expanders that provide the cooling capacities necessary at the different temperature levels and liquefy the helium to 4.5 K before it passes to the 1.8 K refrigeration unit. Each refrigerator is equipped with a fully automatic process control system that manages about 1000 inlets and outlets per plant.

The complete system of eight 4.5 K refrigerators takes the cooling capacity at 4.5 K to 140 kW, that is almost 40,000 litres of liquid helium per hour. This is huge progress since LEP2, to say nothing of the days before LEP when most cryogenic needs were for individual experiments (figure 1). An electrical input power of 32 MW (4 MW per refrigerator) will be needed to produce this capacity at 4.5 K.
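A hedged sanity check on these figures: comparing the 32 MW electrical input with the ideal (Carnot) work needed to extract 140 kW at 4.5 K, assuming heat rejection at room temperature (300 K, an assumed value), suggests the plants run at roughly 30% of the thermodynamic limit:

```python
# Compare the quoted electrical input with the Carnot minimum for
# 140 kW of refrigeration at 4.5 K rejecting heat at 300 K.
T_cold = 4.5        # K, refrigeration temperature
T_warm = 300.0      # K, ASSUMED heat-rejection temperature
Q_cold = 140e3      # W, total capacity at 4.5 K (from the text)
P_input = 32e6      # W, quoted electrical input

carnot_cop = T_cold / (T_warm - T_cold)   # ideal coefficient of performance
ideal_power = Q_cold / carnot_cop         # Carnot-minimum input, ~9.2 MW
efficiency = ideal_power / P_input        # fraction of Carnot, ~29%

print(f"ideal input: {ideal_power/1e6:.1f} MW, "
      f"fraction of Carnot: {efficiency:.0%}")
```

A figure near 30% of Carnot is typical for large helium refrigeration plants, so the quoted numbers hang together.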

The process of bringing together the new 4.5 K refrigerators began in 1998 when contracts were signed with Air Liquide and Linde. The design and industrial engineering phases that followed made it possible for the first deliveries to take place from the middle of 2000, while LEP was still in operation. After a period of intensive tests the first new refrigerator for the LHC, built by Air Liquide, was accepted at Point 1.8 in March 2002. It has since been used to supply the test benches for the superconducting magnets. In December 2002 a second Air Liquide refrigerator was accepted at Point 4, followed by the two refrigerators manufactured by Linde in August and December 2003 at Points 8 and 6.

Of course the story does not end with the installation of the 4.5 K refrigerators. From now on the LHC cryogenic team will focus on upgrading the four LEP refrigerators already in place at Points 2, 4, 6 and 8. Simultaneously, they will finish installing the cryogenic infrastructure and 1.8 K refrigeration units to supplement the 4.5 K system. These installations will gradually be brought into operation to test the other cryogenic assemblies (vertical transfer lines, interconnection boxes, local transfer lines and tunnel distribution lines). This should enable final adjustments to be made and provide the experience necessary to face up to the next challenge, that of cooling the first sector of the machine in 2005.

There’s plenty of room at the top

The universe is the largest and richest laboratory at our disposal for studying the laws of nature. In it, matter and energy undergo fundamental interactions and endure extreme conditions for infinitesimal times or for billions of years. Using advanced instruments we are able to select signals that reach us from the depths of space and time, and to extract data on fundamental physics that would not be available even with the most complex experiments performed in our laboratories.

To review the potential of particle and fundamental physics in space, a new series of international conferences, called SpacePart, began in 2002 on the Italian island of Elba. Following the success of SpacePart ’02, which was sponsored by the Istituto Nazionale di Fisica Nucleare (INFN) and the universities of Pisa and Perugia, NASA played host to the 2003 conference on 10-12 December in Washington, DC, jointly with Stanford’s Kavli Institute for Particle Astrophysics and Cosmology, the Massachusetts Institute of Technology (MIT) and INFN. The objective of SpacePart ’03 was to explore the possibilities of doing fundamental physics in space during the next 20 years. The meeting was attended by researchers from a number of diverse subfields of physics and astrophysics – from cosmology and gravitation to elementary particle physics – but all of them had a common interest in space-based experiments.

Coming together

Patrick Looney from the US Office of Science and Technology Policy delivered the opening address, representing the presidential science advisor, Jack Marburger. Looney underlined the importance of bringing agencies and research programmes together, in order to talk about the connectedness of the physical sciences and to show its relevance to the broader quest for discoveries in the universe. Indeed, SpacePart ’03 was a perfect example of this, as all of the US agencies currently active in space science – NASA (Mike Salamon), the Department of Energy (Ray Orbach) and the National Science Foundation (Mike Turner) – presented their programmes. The major European and Japanese agencies also participated, with representatives from ESA (Oliver Jennrich), INFN (Roberto Battiston), ASI (Simona Di Pippo) and ISAS/JAXA (Tadayuki Takahashi).

Frank Wilczek of MIT gave an inspirational keynote talk, in which he described the universe as a strange place characterized by basic numbers – such as the densities of matter, dark matter and dark energy – that we are completely unable to explain. New theoretical ideas are desperately needed to explain dark energy, as the cosmological constant may not be adequate. In addition, Wilczek pointed out that axions might well be a better candidate for dark matter than the supersymmetric neutralino.

To try to answer these questions, a number of space-borne experiments are being prepared to study the various components of the cosmic radiation. The charged energetic part – cosmic rays – will be measured up to the TeV region with very high accuracy by the magnetic spectrometers PAMELA (in 2005) and AMS-02 (in 2007). Sam Ting of MIT/CERN gave a status report on AMS-02, the first superconducting spectrometer to be operated in space, which will, by the end of the decade, reach an accuracy of one part in a billion in the search for antimatter nuclei.

Jonathan Feng of the University of California, Irvine, discussed the prospects for the indirect detection of dark matter in space-based experiments. He reviewed the various possible scenarios for weakly interacting massive particles (WIMPs): the lightest supersymmetric particle (LSP) based on bino-higgsino mixing, Kaluza-Klein dark matter, and superWIMP dark matter if the gravitino is the LSP. These models predict a distortion in the spectrum of electrons, positrons and gamma rays, which may be visible with new cosmic-ray experiments such as PAMELA, and in particular with the high statistics expected from AMS-02 and the Gamma Ray Large Area Space Telescope, GLAST.

Hard hitters

The region of extremely energetic cosmic rays, above 10²⁰ eV, will also be actively pursued because of its scientific interest. Angela Olinto of Chicago reviewed the experimental situation regarding these messengers of the extreme universe – particles that carry the energy of a tennis ball served by a top-class player. Even if we do not yet understand where these particles come from and how they reach such huge energies, “Zevatron” (10²¹ eV) accelerators do seem to exist somewhere in our universe. The advent of the Auger project on the ground, followed at the end of the decade by the Extreme Universe Space Observatory on the International Space Station, will open the way to cosmic-ray astrophysics, as the highest energy particles are so energetic they can traverse vast regions of the universe without significant deflection. Perhaps more exciting, but still more uncertain, would be the possible detection of extremely energetic neutrinos, as discussed by Tom Weiler from Vanderbilt who stressed how significantly the rate predictions are affected by the uncertainty in the neutrino cross-section at these energies.
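The tennis-ball comparison is easy to check with a few lines of arithmetic. The ball mass and serve speed below are illustrative assumptions chosen to match the quoted energy scale, not figures from the source (a hard professional serve is faster, and the kinetic energy scales as v²):

```python
# Back-of-envelope: a 10^20 eV cosmic ray vs the kinetic energy of
# a served tennis ball.
EV_TO_J = 1.602e-19                   # joules per electronvolt

cosmic_ray_energy = 1e20 * EV_TO_J    # ~16 J

# ASSUMED illustrative figures: a regulation 57 g ball at ~24 m/s
# (about 85 km/h) carries roughly the same kinetic energy.
mass = 0.057                          # kg
speed = 24.0                          # m/s
ball_energy = 0.5 * mass * speed**2   # ~16 J

print(f"cosmic ray: {cosmic_ray_energy:.1f} J, tennis ball: {ball_energy:.1f} J")
```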

Neutron stars have been known for some time, but we are still not completely sure whether some of them could be made of strange matter rather than neutrons. A fragment of a strange star – a strangelet – would behave like a cosmic ray with an anomalously low charge-to-mass ratio. Jack Sandweiss of Yale reviewed the status of strangelet searches. A strangelet could be detectable in a particle spectrometer such as AMS-02, and indeed one event compatible with the strangelet hypothesis was observed by AMS-01 during the successful STS-91 mission in 1998. A strangelet would, however, also have a peculiar interaction with the Earth, giving rise to a unique pattern of “epilinear” seismic signals that would indicate a linear source. Such events have been searched for: among more than a million seismic events recorded in the years 1990-1993, one puzzling event has been found that could be compatible with the passage of a strange-matter nugget through the Earth.

By the light of the Moon

Our pale satellite, the Moon, has been a source of poetical inspiration for thousands of years. It is possibly less well known, however, that since 1969 the Moon has also been a source of high-precision tests of general relativity through reflections of a laser beam from mirrors located on the lunar surface. Advances in detector technology will allow a tenfold gain in sensitivity in these measurements in the coming months. The APOLLO experiment at the 3.5 m telescope at the Apache Point Observatory in New Mexico will perform lunar ranging with millimetre resolution. Each pulse to the Moon will contain 1600 laser photons within a 95 picosecond jitter, of which about one photon will be detected back on Earth with a timing accuracy corresponding to a resolution of about 20 mm. Millimetre sensitivity can be obtained within one minute, a time that should be short enough to beat systematic effects. Among other things, this accuracy would be sufficient to test the weak equivalence principle in the 10⁻¹⁴ region. Next summer, laser ranging from Apache Point will also include tests using the Mercury Orbiter spacecraft, while in future additional improvements in accuracy might come from a laser-ranging experiment on Mars.
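The quoted timing and range numbers can be related by the standard two-way ranging conversion, range = c·Δt/2. The sketch below uses only that relation plus a 1/√N photon-averaging assumption; the ~20 mm single-photon figure in the text presumably folds in additional error sources beyond the pulse jitter alone:

```python
# Convert round-trip timing jitter into one-way range resolution for
# lunar laser ranging, and estimate how many photons must be averaged
# to reach millimetre precision.
C = 299_792_458.0             # speed of light, m/s

jitter = 95e-12               # 95 ps round-trip jitter per photon
single_shot = C * jitter / 2  # one-way range error per photon, ~14 mm
print(f"single-photon range error: {single_shot*1e3:.1f} mm")

# Averaging N detected photons improves the error as 1/sqrt(N), so
# ~1 mm needs a couple of hundred photons - a minute or so of data
# at roughly one detected photon per pulse.
n_photons = (single_shot / 1e-3)**2
print(f"photons for 1 mm precision: {n_photons:.0f}")
```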

A major improvement in sensitivity is expected from a new NASA mission under review, the Laser Astrometric Test of Relativity (LATOR), which was presented at the conference by Slava Turyshev from the Jet Propulsion Laboratory. LATOR will be based on inter-spacecraft laser ranging, and is aiming to improve dramatically the accuracy on the γ factor from about 2 × 10⁻⁵ (the already precise result obtained in 2003 using the Cassini spacecraft) to the interesting sensitivity region of 10⁻⁸. A test of the LATOR concept is also planned on the International Space Station.

A “cool” subject at SpacePart ’02 had been the cosmic microwave background (CMB). However, with the Wilkinson Microwave Anisotropy Probe now operating and the increase in sensitivity with its successor, Planck, still awaited, the new hot topic at SpacePart ’03 was CMB polarization. Matias Zaldarriaga of Harvard presented recent data from the DASI instrument, showing that the CMB is weakly polarized. Polarization of the CMB, due to Thomson scattering at the time of recombination, links matter density fluctuations to gravitational-wave and lensing effects. Since the power spectrum of cosmological gravitational waves depends on the inflation mechanism, measurements of the CMB polarization should give a glimpse, through the epoch of recombination, into the very early phases of the universe.

Emerging technologies

At the beginning of the next decade the direct detection of gravitational waves by LISA, the three-satellite ESA-NASA interferometer with arms five million kilometres long, will open up a new kind of astronomy that is able to see through very dense regions of our galaxy and even behind the recombination shroud. Stefano Vitale of Trento discussed the status of ESA’s LISA pathfinder mission, SMART-2, which is planned for launch in 2006 to test basic technological aspects of this ambitious project.

A special session was devoted to emerging technologies that should further improve the physics reach of astroparticle physics in space. Mark Kasevich of Stanford reviewed the prospects for Bose-Einstein interferometry in space, which would allow the construction of ultra-sensitive accelerometers to test gravitational effects to an outstanding level of accuracy. This technology is developing at an incredible pace because of the many applications in the more mundane field of navigation systems and geodesy measurements. Gert Viertel of ETH Zürich presented results from a prototype synchrotron radiation detector flown on the STS108 mission on the space shuttle Endeavour. This detector makes use of the X-ray emission from TeV electrons/positrons interacting with the magnetosphere to measure the flux and charge of these particles in a high-energy window that is not covered by magnetic spectrometers.

Il Park of Ewha Womans University in Korea presented a new concept of adaptive optics for a large field of view, based on micro-machined movable mirrors that are capable of following rapidly changing light sources in real time – to monitor, for example, the emission of an extremely energetic cosmic-ray shower in the atmosphere. Tadayuki Takahashi of ISAS/JAXA also stressed the role of micro- and nanotechnologies for future astroparticle-physics experiments in space, in building and exploiting cheaper, smaller satellites with a faster turnaround time.

Easier access to space is certainly needed for the growth of this exciting field if it is to attract young talents and exploit new ideas. Indeed, astroparticle physicists have only started to scratch the surface of the potential of space for fundamental physics, and there is still, to paraphrase a well known saying, “plenty of room at the top”. SpacePart ’04 is to be held in Beijing at the end of 2004 and will be sponsored by the Chinese Ministry of Science and Technology. See you there!

FINUDA’s first results open up new window on exotic nuclei

The first results from the FINUDA experiment at INFN’s Frascati National Laboratory show that the detector is performing well and is in good shape for its future studies of hypernuclear physics. At the XLII International Winter Meeting on Nuclear Physics in Bormio, Italy, at the end of January, the FINUDA team presented data on the performance of the detector, as well as preliminary observations of the formation of hypernuclei and their decay spectra.

The FINUDA detector was installed at the DAFNE “φ factory” in Frascati in the spring of 2003. The experiment makes use of the low-energy negative kaons emitted in the decays of the φ particles created in DAFNE. The decays produce an almost monochromatic beam of K⁻ with an energy of about 16 MeV. These low-energy K⁻ can come to a stop in thin targets and interact with nuclei via a strangeness-exchange reaction, where the strangeness of the kaon is transferred to a nucleus in which a neutron (containing udd quarks) becomes a lambda particle (uds).

The use of thin targets means that the FINUDA experiment can make the most of its intrinsic momentum resolution in order to provide high-resolution measurements of hypernuclear energy levels. In addition, the apparatus is designed to detect charged and neutral particles with large angular coverage and high statistics. The experiment can also measure spectra from different targets at the same time, so reducing the number of possible systematic errors.

Once the commissioning of FINUDA was complete in October 2003, data taking could begin with a set of targets of different nuclei – ⁶Li, ⁷Li, ¹²C, ²⁷Al and ⁵¹V – that were chosen to allow a variety of simultaneous studies of the formation and decay of hypernuclei. The targets form an octagon surrounding the interaction region, where the K⁻ are produced in the decay of φ particles to K⁺K⁻ pairs. Within the target array, thin slabs of scintillator detect the highly ionizing low-energy kaons. Hypernuclear-formation events are selected by a trigger that picks out K⁺K⁻ pairs accompanied by a fast particle (a pion) coming from the interaction of the K⁻ in a target (see figure 1).

The data collected so far indicate a momentum resolution, Δp/p, of 1.1% full width at half maximum (FWHM), corresponding to a resolution of approximately 2.5 MeV on hypernuclear energy levels (see figure 2). This value should improve after final calibration and detector alignment. The indications are that during its first phase of data taking FINUDA should collect about 10⁵ useful events per target – enough for high-resolution spectroscopy on the various nuclei.
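A hedged sketch shows how a momentum resolution of this size maps onto an energy-level resolution. The outgoing pion momentum of about 270 MeV/c, typical of hypernuclear formation by stopped kaons, is an assumed value, not a figure from the source; propagating the momentum spread through the relativistic energy-momentum relation then gives a number close to the quoted 2.5 MeV:

```python
import math

# Propagate a 1.1% FWHM momentum resolution into an energy resolution
# for the outgoing pion, using E^2 = p^2 + m^2 (so dE = (p/E) dp).
m_pi = 139.57          # MeV, charged-pion mass
p = 270.0              # MeV/c, ASSUMED typical pion momentum for
                       # hypernuclear formation by stopped K-

E = math.sqrt(p**2 + m_pi**2)   # pion total energy, ~304 MeV
dp = 0.011 * p                  # 1.1% FWHM momentum spread, ~3 MeV/c
dE = (p / E) * dp               # propagated energy resolution, ~2.6 MeV

print(f"energy resolution: {dE:.1f} MeV FWHM")
```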

First module for the CMS solenoid magnet heads from Genova to Geneva…

The first of five modules that will form the superconducting solenoid magnet for the CMS experiment at CERN was ready to leave the Italian port of Genova at the end of January, subject to good weather conditions. The magnet, which has an inside diameter of 6.3 m and a length of 12.5 m, has required a modular construction to allow transportation from the fabrication site in Italy to CERN. The five modules, each 2.5 m long and weighing 45 tonnes, are being transported one by one to CERN, where they will be assembled into the final solenoid.

The solenoid, which represents the “S” in CMS (Compact Muon Solenoid), is the product of an international collaboration between the French Commissariat à l’Energie Atomique (CEA), CERN, the Italian National Institute for Nuclear Physics (INFN), ETH Zürich and Ansaldo Superconductors of Genova. Ansaldo was entrusted with the construction of the five modules constituting the magnet, which generates a magnetic field of 4 T. Once completed, the superconducting solenoid will boast a notable distinction: with 2.6 gigajoules it will hold the world record for energy stored in a magnet.

INFN has been responsible for the design and construction of the so-called cold mass, i.e. the coil and mechanical structures that will be cooled to 4.2 K. Construction of the coil has required the development of some innovative technologies. Since the magnetic field is so high and the device so big, large electromagnetic forces are generated inside the solenoid, causing mechanical deformation that could prevent it from working. The standard solution for such problems is to use a reinforcing mechanical structure to contain the solenoid, but this would not have been sufficient in this case. To avoid even the smallest deformation, which would make the cables lose their superconducting properties, the reinforcement has been inserted directly inside the cables. This innovative solution required remarkable technical skill. It was also necessary to develop a sophisticated automated winding system to form the solenoid coils with high geometrical precision.

…while the ATLAS solenoid approaches its final position

The ATLAS superconducting solenoid has been moved for nearly the last time and is now in position in the assembly hall on the Meyrin site at CERN, opposite the cryostat that will house the liquid-argon electromagnetic calorimeter. All that remains to do now is to slide the solenoid into the insulating vacuum vessel.

Built by Toshiba under the responsibility of KEK in Japan, the solenoid is 2.4 m in diameter, 5.3 m long and weighs 5.5 tonnes. Its axial magnetic field of 2 T will deflect the particles inside the ATLAS inner detector. The inner detector, which consists of three sub-detectors, will be installed inside the solenoid at a later date, before the complete structure is transported across the road from the main site to the ATLAS cavern at Point 1 on the ring of the Large Hadron Collider.

SPEAR ring comes to life again as a dazzling new synchrotron light source…

The latest reincarnation of the famous SPEAR storage ring – SPEAR3 – was formally opened at a dedication ceremony at the Stanford Linear Accelerator Center (SLAC) on 29 January. The new synchrotron light source incorporates the latest technology – much of it pioneered at the Stanford Synchrotron Radiation Laboratory (SSRL) and SLAC – to make it competitive with the best synchrotron sources in the world. Some 2000 scientists will use the new machine’s extremely bright X-ray light each year in studies ranging from materials science to structural biology.

Thirty years ago SSRL was among the first laboratories in the world to use synchrotron-produced X-rays for studying matter at atomic and molecular scales, and the first to offer beam time to a broad user community of scientists from academic, industrial and government laboratories (based on peer-reviewed proposals). The original SPEAR ring, built for particle-physics research at SLAC, yielded several major discoveries in the field and also provided fertile ground for innovation in synchrotron techniques.

SPEAR3 is a complete rebuild and upgrade of the SPEAR2 ring, with all the magnets, vacuum and radiofrequency systems having been replaced. The new ring also has the capacity for eight to ten more beam lines, with associated experimental stations. A gift of $14.2 million (~€11.2 million) from the Gordon and Betty Moore Foundation to the California Institute of Technology will allow scientists at Caltech and Stanford University to collaborate on building a designated beam line for structural molecular-biology research. The quality and brightness of SPEAR3’s X-ray light are well suited to studying complex biological systems.

The SPEAR3 project was completed on time and within its budget of $58 million (~€46 million). The first electron beams circulated in the new ring in mid-December 2003 and the first experiments are scheduled to begin in March this year.

…and STELLA lights way to a better electron accelerator

Researchers at the Brookhaven National Laboratory have developed a compact linear accelerator that uses laser light to accelerate electrons with better efficiency and energy characteristics than before. The results from the experimental device, called Staged Electron Laser Acceleration (STELLA), may help in the development of linear accelerators that can deliver higher energies than are currently possible using microwaves, without becoming unfeasibly long and expensive.

The STELLA experiment was performed at Brookhaven’s Accelerator Test Facility (ATF) by a collaboration from Brookhaven and the universities of California at Los Angeles, Stanford and Washington, together with STI Optronics. Electrons from a standard linac at the ATF were injected into STELLA with an initial energy of 45 MeV. The electrons were then directed into an inverse free-electron laser (IFEL), with powerful permanent magnets that force the electrons to “wiggle” so that they radiate. Simultaneously, a laser beam was sent through the IFEL. The laser effectively regulated the electrons’ energy, speeding up the slower electrons and slowing down the faster ones. As a result, the slow electrons caught up with the fast ones and grouped together to create “microbunches”, each only 1 micrometre long. These are among the shortest microbunches ever created; their importance lies in making it possible to accelerate the electrons without leaving many behind. Additionally, the microbunches were separated by only 10.6 micrometres, a distance equal to the wavelength of the laser beam.

The microbunches then travelled into a second IFEL, where they were accelerated by the laser beam to a higher energy while staying tightly grouped. This staging is the essence, and the most difficult part, of the experiment. However, the researchers were able to accelerate the microbunches without leaving behind a large number of electrons: up to 80% remained trapped in the microbunches. Moreover, STELLA accelerated the electrons from 45 MeV to more than 54 MeV. The trapped electrons also remained more or less monoenergetic, with more than two-thirds staying within 0.36% of 54 MeV.

Gigabits, the Grid and the Guinness Book of Records

On five separate occasions during 2003, a team led by Harvey Newman of Caltech and Olivier Martin of CERN established new records for long-distance data transfer, earning these renowned academic institutions a place in the Guinness Book of Records. This year, new records are expected to be set as the performance of single-stream TCP (Transmission Control Protocol) is pushed closer to 10 Gbps (gigabits per second). In 1980 “high speed” meant data transfers of 9.6 kbps (kilobits per second) using analogue transmission lines, so the achievement of 10 Gbps in 2004 corresponds to an increase by a factor of a million in 25 years. This advance is even more impressive than the classic “Moore’s law” of computer processing, in which the number of transistors per integrated circuit (a proxy for processing power) grows exponentially, doubling every 18 months, or increasing by a factor of about 1000 every 15 years.
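A quick sanity check on the figures quoted above, comparing the growth in network speed with what Moore's law would give over the same period:

```python
# Growth in "high-speed" network transfer rates quoted in the text:
# 9.6 kbps (1980) to 10 Gbps (2004), which the article rounds to 25 years.
network_factor = 10e9 / 9.6e3
print(f"network speed-up: {network_factor:.2e}")   # ~1e6, the quoted factor of a million

# Moore's law over the same 25 years: a doubling every 18 months.
moore_factor = 2 ** (25 * 12 / 18)
print(f"Moore's law over 25 years: {moore_factor:.2e}")  # ~1e5
```

Networking speeds thus grew roughly ten times faster than Moore's law over the period, which is the sense in which the advance is "even more impressive".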

While chasing such records may sound like an irrelevant game, the underlying goal is of great importance for the future of data-intensive computing Grids. In particular, for CERN and the physicists across the world working on experiments at the Large Hadron Collider (LHC), the LHC Computing Grid will depend critically on sustained multi-gigabit-per-second throughput between sites. The evolution of such long-distance networking capabilities has been an important part of CERN’s development as a laboratory serving not only European users but also those across the globe.

The early days

Computer networks have been of increasing importance at CERN since the early 1970s, when the first links were set up between experiments and the computer centre. The first external links, for example to the Rutherford Laboratory in the UK, were established only in the late 1970s and served very limited purposes, such as remote job submission and output-file retrieval. Then, from 1974 onwards, electronic mail developed extraordinarily, together with the EARN/BITnet and UUCP mail-network initiatives. However, it was only in the late 1980s that the foundations of today’s high-speed networks were truly laid. Indeed, the first international 2 Mbps (megabits per second) link was installed by INFN during the summer of 1989, just in time for the start-up of CERN’s Large Electron–Positron collider. However, there was still no Europe-wide consensus on a common protocol, and as a consequence multiple backbones had to be maintained, e.g. DECnet, SNA, X.25 and TCP/IP (TCP/Internet Protocol).

Back in late 1988, the National Science Foundation (NSF) in the US made an all-important choice when it established NSFnet, the first TCP/IP-based nationwide 1.5 Mbps backbone. Initially used to connect the NSF-sponsored Supercomputer Centers, it was later extended to serve regional networks, which themselves connected universities. The NSFnet, the origin of both the academic and the commercial Internet, served as the backbone of the emerging commercial Internet until its shutdown in 1995.

In 1990 CERN picked up on this development – not without courage – and together with IBM and other academic partners in Europe developed the use of EASInet (European Academic Supercomputer Initiative Network), a multi-protocol backbone that took account of Europe’s networking idiosyncrasies. EASInet, which also provided a 2 Mbps TCP/IP backbone to European researchers, had a 1.5 Mbps link to NSFnet through Cornell University and was at the origin of the European Internet, together with EBONE. These developments established TCP/IP as the major protocol for Internet backbones around the world.

The Internet2 land-speed records

In 2000, to stimulate continuing research and experimentation in TCP transfers, the Internet2 project, a consortium of approximately 200 US universities working in partnership with industry and government, created a contest – the Internet2 land-speed record (I2LSR). This involves sending data across long distances by “terrestrial” means – that is, by underground and undersea fibre-optic networks rather than by satellite – using both the current Internet standard, IPv4, and the next-generation Internet protocol, IPv6. The unit of measurement for the contest is bit-metres per second, a wise and fair choice: the difficulty of achieving high throughput with standard TCP installations, e.g. on Linux, does indeed grow with distance, because the round-trip time, and hence the amount of data TCP must keep in flight, grows with it.

In 2003 CERN and its partners were involved in several record-breaking feats. On 27-28 February a team from Caltech, CERN, LANL and SLAC entered the science and technology section of the Guinness Book of Records when they set an IPv4 record with a single 2.38 Gbps stream over a 10,000 km path between Geneva and Sunnyvale, California, by way of Chicago. Less than three months later, a new IPv6 record was established on 6 May by a team from Caltech and CERN, with a single 983 Mbps stream over 7067 km between Geneva and Chicago.

However, thanks to the 10 Gbps DataTAG circuit (see “DataTAG” box), which became available in September 2003, new IPv4 and IPv6 records were established only a few months later, first between Geneva and Chicago, and then between Geneva, California and Arizona. On 1 October a team from Caltech and CERN achieved the amazing result of 38.42 petabit-metres per second with a single 5.44 Gbps stream over the 7073 km path between Geneva and Chicago. This corresponds to the transfer of 1.1 terabytes of physics data in less than 30 minutes, or the transfer of a full-length DVD to Los Angeles in about 7 seconds.

Then in November a longer 10 Gbps path to Los Angeles, California and Phoenix, Arizona, became available through Abilene, the US universities’ backbone, and CALREN, the California Research and Education Network. This allowed the IPv4 and IPv6 records to be broken yet again on 6 November, achieving 5.64 Gbps with IPv4 over a path of 10,949 km between CERN and Los Angeles, i.e. 61.7 petabit-metres per second. Five days later, a transfer at 4 Gbps with IPv6 over 11,539 km between CERN and Phoenix through Chicago and Los Angeles established a record of 46.15 petabit-metres per second.
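The record figures above follow directly from the I2LSR metric, throughput multiplied by distance. A minimal check (small discrepancies with the quoted values come from rounding of the path lengths):

```python
def petabit_metres_per_s(gbps, km):
    """Internet2 land-speed-record metric: throughput times distance,
    expressed in petabit-metres per second."""
    return gbps * 1e9 * km * 1e3 / 1e15

# Records quoted in the text
print(petabit_metres_per_s(5.44, 7073))   # Oct 2003 IPv4, Geneva-Chicago: ~38.5
print(petabit_metres_per_s(5.64, 10949))  # Nov 2003 IPv4, CERN-Los Angeles: ~61.8
print(petabit_metres_per_s(4.0, 11539))   # Nov 2003 IPv6, CERN-Phoenix: ~46.2
```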

As with all records, there is still ample room for improvement. With the advent of PCI Express chipsets, faster processors, improved motherboards and better 10 Gigabit Ethernet (10GigE) network adapters, there is little doubt that it will be feasible to push the performance of single-stream TCP transport much closer to 10 Gbps in the near future – that is, well above 100 petabit-metres per second over intercontinental distances.

As Harvey Newman, head of the Caltech team and chair of the ICFA Standing Committee on Inter-Regional Connectivity, has pointed out, these records are a major milestone towards the goal of providing on-demand access to high-energy physics data from around the world, using servers that are affordable to physicists from all regions. Indeed, for the first time in the history of wide-area networking, performance has been limited only by the end systems and not by the network: servers side by side have the same TCP performance as servers separated by 10,000 km.
