Deep in a former Minnesota iron mine, scientists and government officials recently wielded pickaxes to chip away, at least symbolically, at the mysteries surrounding the subatomic particles known as neutrinos.
The miners-for-a-day in the Soudan mine took the first steps in carving out a huge cavern, half a mile underground, that will become home to a 5000-ton particle detector. Physicists of the 200-member Main Injector Neutrino Oscillation Search (MINOS) experiment will use the detector to explore the question of neutrino mass.
Physicists once believed that neutrinos were massless, zipping through matter at the speed of light. However, recent results suggested that these elusive particles have a small mass. Even a tiny neutrino mass would have major consequences for our understanding of the nature and distribution of mass in the universe.
For the MINOS experiment, 120 GeV protons from Fermilab’s newly commissioned Main Injector accelerator will be used to generate an intense beam of muon neutrinos directed towards the underground Soudan detector, 730 km away. Neutrinos are reluctant to interact with matter and can pass through the Earth without any effect. However, the interaction probability is not zero, so an intense beam and/or a large detector will record some interactions.
From early 2003, MINOS collaborators will use the detector to determine whether some of the muon neutrinos in the beam have changed to another type, known as tau neutrinos. Such a change, or oscillation from one type to another, would constitute clear evidence of neutrino mass and would allow physicists to begin to calculate just how much mass the particles possess. Locating the detector far below ground screens out cosmic rays that would otherwise flood the detector with irrelevant signals.
Earlier neutrino experiments detected fewer naturally-occurring neutrinos than expected, so they concluded that one type of neutrino had oscillated into another and, hence, had “disappeared” from detection. In contrast, the MINOS experiment could detect not only the disappearance of muon neutrinos, but eventually their appearance as neutrinos of a different type: tau neutrinos.
To monitor the evolution of the neutrino beam, MINOS will use two detectors: the “near detector” at Fermilab and the 5400-tonne “far detector” in the Soudan tunnel. The design initially proposed foresees fine-grained iron-scintillator sandwich calorimeters providing both tracking and energy deposition information.
On the other side of the Pacific Ocean is a similar study, K2K, which uses neutrinos provided by the accelerator at the Japanese KEK Laboratory in Tsukuba. The particles are directed towards the Super-Kamiokande underground detector 250 km away. K2K is recording its first data.
At CERN, physicists are studying a proposal to direct a neutrino beam towards the Italian Gran Sasso underground laboratory. Curiously, like MINOS, the distance from the accelerator neutrino source would be 730 km.
During August, work continued to establish circulating beams of nuclei in the two rings of Brookhaven’s Relativistic Heavy Ion Collider (RHIC). Equipped with superconducting magnets, the machine operates at 4.6 K.
The beam was injected and stored in the first (“blue”) ring. Lifetimes of up to 45 min and modest acceleration (to about 1 GeV per nucleon) were achieved. A few apparent obstacles will be investigated when the ring is warmed up.
Thousands of turns of beams have been seen briefly in the second (“yellow”) ring, with successful radiofrequency capture. Long lifetimes have not yet been established, nor has acceleration been performed. While no apparent obstacles have been found in this ring, steering and second turn closure in the injection region were found to be more difficult than in the first ring, as was steering through the dump area.
On 1 June, after five years of planning and five years of construction, the BELLE detector recorded its first B meson events at the KEKB electron-positron B-factory in Tsukuba, Japan. During this first run, all BELLE systems were active, including the electron-identification software, which saw a CP-violating candidate decay into a J/psi and a short-lived kaon, with the former producing an electron-positron pair. This marks the real start of the KEKB factory and its quest to investigate the puzzle of CP violation.
The KEKB collider is an energy-asymmetric collider, its two rings handling 3.5 GeV positrons and 8 GeV electrons in the old 3 km TRISTAN tunnel. The injection linac was upgraded from that used for TRISTAN and was ready long before the commissioning of the rings began last December.
Beams were first stored in the electron ring (HER) on 13 December and in the positron ring (LER) on 14 January. After several interruptions, scheduled and unscheduled, the stored currents rose to 500 mA, corresponding to 20% and 50% of the design values for the LER and HER respectively, by mid-April.
The limiting factors are currently believed to be radiofrequency power for HER and vertical beam blow-up due to a multibunch instability. Of the two major unscheduled breaks experienced so far, the first was provoked by a false fire alarm in the tunnel. The second came when a spot of the vacuum chamber was melted by synchrotron radiation from the final focusing quadrupole during trials of a large angle bump orbit. The chamber was quickly replaced with one of heat-resistant design.
Commissioning was halted from 1 May for the installation of the BELLE detector, which crept into the collision point at a speed of 0.5 m/min, taking about 30 min to complete the “long trip”. Beam tuning resumed on 25 May and collisions were achieved on 1 June.
Initially, currents in both beams were kept below 20 mA to guard against radiation damage to the BELLE detector’s silicon vertex detector. Beam currents have subsequently been increased to more than 100 mA.
Prior to establishing collisions, a potentially harmful 0.45 Hz vertical beam vibration was observed in the low-energy positron ring. This was tracked down to tiny magnetic fields produced by current leads running between the power supply and the main ring of KEK’s 12 GeV proton synchrotron. These leads pass through a channel above the KEKB tunnel 8 m from the beam pipes.
The proton synchrotron is currently providing a neutrino beam directed at the Super-Kamiokande detector, 250 km distant, for the high-priority K2K experiment, which is searching for confirmatory evidence for the neutrino oscillations reported by the Super-K group last year.
Fortunately for BELLE, KEKB’s first time slot for electron-positron collisions coincided with a maintenance day for the proton synchrotron, resulting in the successful test run of 1 June. The troublesome stray fields have since been reduced substantially by reconfiguring the proton synchrotron’s power leads.
In addition to providing BELLE data, the successful test run verifies a number of novel KEKB design features: the finite-angle beam crossing did not result in beam-destroying synchro-betatron oscillations, and the growth times of the instability due to fundamental mode shifts in the ARES energy storage radiofrequency cavities were longer than the natural damping times of the storage rings.
KEKB has achieved stable collisions with design values of single-bunch currents. The next goal is to increase the number of bunches and boost the luminosity towards the design figure of 10^34 cm^-2 s^-1. As the number of ampere-hours of integrated beam current in the machine increases, the vacuum in the beam pipe continues to improve and background conditions in the BELLE detector decrease.
Currently the luminosity is estimated to be 2 x 10^31 cm^-2 s^-1, with tens of hadron events collected every few hours. This performance should soon increase and open up a rich programme of studies of B meson decays to probe the puzzle of CP violation.
On 15 July in Newport News, Virginia, Jefferson Lab’s free-electron laser (FEL) produced infrared light at a wavelength of 3.1 µm and an average power of 1.72 kW, thereby exceeding the kilowatt design goal. No FEL had previously exceeded 14 W.
The infrared Demo FEL is the first in a series of high-average-power, wavelength-tunable FELs being developed at Jefferson Lab for basic science, industrial applications and applied defence research. FEL development is a spinoff from the laboratory’s main mission of accelerator-based investigations into the quark structure of nuclei. The superconducting radiofrequency (SRF) electron accelerator at the heart of the FEL is derived from the technology of Jefferson Lab’s 6 GeV continuous-wave main machine.
In July’s record-setting laser operation, untapped electron-beam energy was recovered by recirculating the beam back through the driver accelerator for “deceleration.” The driver was operated at 4.4 mA and 48 MeV, approaching its maximum design power of 250 kW. Energy-recovery capability would contribute significantly to fully developed SRF-driven FELs’ cost-effectiveness.
Upcoming FEL user experiments include silicon characterization studies (by a group from Vanderbilt), laser photodeposition (Norfolk State), photoablation (William and Mary), polymer surface modification (DuPont) and metal surface modification (Armco/Virginia Power).
Crucial to Jefferson Lab’s ongoing FEL effort is the support of the US Department of Energy, the Office of Naval Research, the Commonwealth of Virginia and industry and university partners in the Laser Processing Consortium.
On 2 August, a decade after the initial commissioning at 45 GeV per beam, CERN’s LEP storage ring collided 100 GeV beams of electrons and positrons, giving a total energy of 200 GeV and a healthy luminosity (a measure of the collision rate) of 6 x 10^31 cm^-2 s^-1.
This year, LEP supplied colliding beams initially at 96 GeV, and then at 98 GeV. However, the 100 GeV per beam figure was always on the cards following the delivery by LEP of a single 100 GeV electron beam in a debut 1999 high-energy run.
On 16 July, a beam of gold nuclei circulated in one of the two rings of Brookhaven’s Relativistic Heavy-Ion Collider (RHIC) for the first time.
Equipped with superconducting magnets, the machine operates at 4.6 K. After several “mini-ramp” cycles to test the beam-accelerating system in the first (blue) ring, the second (yellow) ring was powered in preparation for injection. The purpose of this commissioning phase is to accelerate beams in both rings for collisions. After commissioning, the RHIC research programme proper is scheduled to begin in November this year.
CERN’s contribution to industry is not limited to high technology for frontier science. Through the Scandinavian CoDisCo project, the lab is also helping to define management practice for large-scale distributed projects.
CERN’s Large Hadron Collider project represents not only the world’s largest scientific undertaking but also a unique opportunity to study project management on a global scale. The ATLAS and CMS experiments, as well as the LHC, are collaborative projects whose members, and competencies, are distributed around the world along with the suppliers providing equipment, services and raw materials. They provide ideal case-studies for the Scandinavian Connecting Distributed Competencies (CoDisCo) project.
Some companies have already begun to adapt to a global market-place, where their suppliers as well as their customers could be anywhere in the world. Internet-based information systems are enjoying a boom as companies come to rely on them more and more for document handling. Those firms that have taken the plunge report significant savings in time and money after converting from traditional paper-based document management to digital formats. Although the paperless office is almost a reality, with some three-quarters of internal document handling being done on line, companies still resort to traditional methods for their external communications and document handling.
CERN is a natural place for a distributed project management study. The physics community has already had decades to adapt itself to the realities of working with large-scale distributed collaborations. Indeed, the World Wide Web was born out of physicists’ need to communicate and share information. The LHC project, with its Engineering Data Management System, is at the vanguard of this emerging field and both the accelerator and its experiments are truly global in nature.
Funding for CoDisCo comes from the Nordisk Industrifond and seven small and medium-sized Scandinavian companies which hope to learn from the LHC experience. The project’s goal is to define methods and tools to integrate and exploit distributed resources better by collecting distributed competencies to form a single, logical, networked entity. CoDisCo is a two-year project and was formally inaugurated with a meeting at CERN last September. Since then, three students have begun work on their theses on document management at CERN. Two partners, Aker Finnyards of Finland and Hönnun og ráðgjöf of Iceland, are already testing an Internet-based document-management system. As the project moves towards its conclusion, both industry and the increasingly global field of particle physics stand to gain.
The HERA collider at DESY, Hamburg, has been operating with proton and electron (or positron) beams since it was commissioned in 1991. The possibility of having nuclei in its superconducting proton ring emerged as an interesting option at the 1995-96 workshop Future Physics at HERA.
Such an electron-nucleus collider would explore entirely new domains of quantum chromodynamics (QCD), the field theory of quarks and gluons deep inside protons and nuclei.
Significantly larger quark/gluon densities at very small momentum fractions carried by the struck quark/gluon would be accessible compared with the present electron-proton collisions at HERA or in fixed-target experiments. This is expected to reveal a new QCD domain, where the smallness of the coupling is compensated for by a high density of gluons, which leads to novel nonlinear dynamics.
Nuclei also provide additional handles to study diffraction and shadowing phenomena, as well as quark/gluon propagation through nuclear media, related to colour phenomena, in QCD. The nucleus can also be used as a “femtodetector”, giving information on dynamics on the scale of nuclear dimensions.
The physics and accelerator aspects of electron-nucleus collisions at HERA have been investigated over the past three years. The project is considered to be a major future direction for nuclear physics: the Nuclear Working Group of the OECD Megascience Forum endorsed the project as one of three major directions for electron-nucleus physics.
To pursue these efforts further, DESY held the workshop Physics with HERA as Electron-Nucleus Collider on 25-26 May. About 70 participating theorists, experimentalists and accelerator experts reviewed the latest developments and examined the feasibility in terms of accelerator and detector requirements. In the welcome address the then DESY research director, Albrecht Wagner, emphasized the need for a detailed evaluation of the discovery potential of the project.
Review talks covered various theoretical, experimental and accelerator aspects. Important recent developments were also reported in many shorter talks. An indication of the problems was given by the review entitled “Partons, hadrons and theoreticians muddling through the QCD vacuum”.
Nevertheless, theorists were confident that the long-sought-after nonlinear QCD effects could be found and studied in a broad kinematic range. The relative rate of so-called diffractive, or rapidity-gap, events, discovered in electron-proton scattering at HERA a few years ago, should be much larger in electron-nucleus collisions and approach 50%. J/psi and upsilon particles, produced as small quark-antiquark systems, can be used as probes of the nuclear medium through which they propagate, revealing strong absorption effects that are characteristic of the new QCD dynamics.
P Paul, Brookhaven deputy director for Science & Technology, stunned many in the audience by announcing the interest in an electron-nucleus collider at the RHIC machine at Brookhaven, which is now being commissioned. However, the energy would be about a factor of 10 lower than the energy that could be reached at HERA. It was recognized that the heavy-ion collision programmes at RHIC and CERN’s LHC have important connections to electron-nucleus physics, for example in the study of gluon screening effects and establishing safe signals for the quark-gluon plasma.
Several groups are being formed to study these issues in depth. Regular meetings are planned as well as coordination with studies of the RHIC electron-nucleus option.
Arriving at CERN from Novosibirsk’s Budker Institute are magnets for the two transfer lines to feed the LHC collider with protons from the SPS proton synchrotron.
Some 360 dipoles, each 6 m long, and 180 quadrupoles, each 1.4 m long, will be installed in two new underground transfer tunnels, each about 3 km long, linking the SPS and LHC/LEP tunnels. One tunnel is being built by Switzerland as part of its host-state contribution to the LHC.
To equip the tunnels, 10 magnet consignments per month, each bearing two dipoles and a quadrupole, will cover the 6000 km from Siberia over the next 18 months.
Unlike the LHC’s main magnets, these are not superconducting. The Budker Institute supplies them under the 1993 Co-operation Agreement, which covers Russian participation in the LHC.
Preliminary work for dipole elements is handled by the Efremov Institute, St Petersburg, and for quadrupoles by the ZVI factory in Moscow. Additional manufacture and final assembly for the magnets is done at Novosibirsk.
Hall 888, in the north area of CERN’s SPS synchrotron, is muon beam country. For almost two decades this hall hosted the European Muon Collaboration (EMC) spectrometer (accompanied downstream by the NA4 apparatus), subsequently adapted to the needs of the NMC experiment and then the SMC experiment. Using CERN’s high-energy muon beam and a variety of targets, these experiments provided a wealth of insights into the quark/gluon content of nuclear particles.
Their successor will be Common Muon and Proton Apparatus for Structure and Spectroscopy (COMPASS), proposed in March 1996 by a large community of physicists with a keen interest in nucleon structure and hadron spectroscopy. The experiment aims to address remaining questions using all of the artillery available today.
One central issue is to look at the contribution of gluons to the nucleon spin. EMC and SMC made decisive advances towards the understanding of the nucleon spin in terms of its constituents, but the role of the gluon needs to be clarified.
The other major physics objective is to look for particles such as glueballs (composed of gluons rather than quarks), quark-gluon hybrids and quark-antiquark combinations. Such exotica have long been searched for and a few candidates have been identified, but nothing like the rich spectrum expected from theory.
The experiment was approved in February 1997 and the construction of new detectors is proceeding fast. Key features of the new spectrometer (actually a two-stage spectrometer, to allow for large geometrical and dynamical acceptance) are:
full particle identification (charged particles identified using high-granularity ring-imaging Cherenkov (RICH) counters);
calorimetry for energy measurement;
high rate (beam intensities of 10^8 particles per pulse).
Coping with such a high beam rate is the main feature of the new spectrometer. On the detector side, many novelties will be implemented:
for the first time a large quantity of an unusual material (Li6D) will be used for the polarized target;
large-area trackers using “straw tubes” at large angles;
Micromegas developed at Saclay will cover the central part of the first spectrometer;
a small-area tracker of the “double GEM” type (CERN Courier December 1998 p19) will cover the central part of the second spectrometer;
the Cherenkov photons in the RICH will be detected with a large array (6 m2) of wire chambers with caesium iodide photocathodes, a new technique developed at CERN in the RD26 project.
Swallowing data
Such a voracious appetite for data influences the detectors, the data acquisition system and data storage and analysis.
COMPASS will be able to trigger 10^5 times per second and store 10^4 events per second, each typically 30 kbytes, for a total data size of 300 Tbytes per year at a mean acquisition rate of 35 Mbytes per second. Data will be sent via an optical link directly to CERN’s main computer centre using the Central Data Recording (CDR) facility pioneered for the NA48 CP violation experiment.
Still, the quantity of data that COMPASS will handle is such that a host of new problems had to be faced and solved quickly for data-taking next year. Handling the stream of data propelled by the CDR system is a major challenge.
The estimated power needed to process COMPASS data is five times that of the already impressive supercomputer used by NA48. The analysis plan foresees processing the data at CERN, while almost all final physics analysis as well as most of the simulation will be done in the collaborating institutes. The performance of COMPASS computing and analysis will be a useful guide for the high-rate experiments at CERN’s LHC collider which are scheduled to begin operation in 2005.
New software tools
Fortran has had its day, and a move from top-down structured programming to object-oriented programming is in sight. In object-oriented analysis and design, software systems are modelled as collections of co-operating objects, treating individual objects as instances of a class within a hierarchy of classes.
Compared with the well known “top-down structured” programming, object-oriented programming looks more abstract and moves the complexity of development to the first step: the definition of classes and relations. On the other hand, object-oriented programming, with its encapsulation, polymorphism and inheritance features, helps the maintainability of elaborate software, particularly when many authors are involved.
Codes
For an experiment that will be active over many years, and with off-line computing having to keep pace with incoming data, a new off-line code was called for, with object-oriented compatibility.
Of the many programming languages that support object-oriented programming, only two are widespread: C++ and Java.
The main reason behind C++’s success is its backward compatibility with the C language, which is used extensively for on-line and system applications. This also means that C libraries can easily be used within a C++ program. C++’s limitations mainly come from this need to remain compatible with C, which also allows structured top-down programming.
Java is a pure object-oriented programming language, but it is still under development with no compilers available. Right now, C++ looks like the best choice for an object-oriented programming language.
Data and databases
Handling extremely large data volumes and rates is the key feature of COMPASS. C++ provides only low-level access to disk files. In particular, there is no means of managing tape input/output or, in general, tertiary storage devices at the language level.
Database programs can provide extended disk input/output power, giving a consistent framework with many new functions. As well as plain data, the C++ language is capable of handling objects and object collections.
To take advantage of C++ means storing and retrieving structures (which can be very complicated and can evolve with time) in a transparent way and without adding too much complexity.
The most natural choice is object-oriented databases that use C++ language and can handle memory and disk transparently. CERN plays a leading role in the development and use of such systems in high-energy physics. The RD45 collaboration was set up five years ago and COMPASS (as well as the NA45 heavy-ion study at CERN and BaBar at SLAC, Stanford) are taking advantage of this work.
Handling 10^10 events per year, with all of the associated complexity, calls for:
minimal duplication of information;
seamless access to the data from different sources (events, calibration and alignment);
direct access to specific parts of the event information for some selected sample;
transparent data access from local and remote sites.
Currently the most promising candidate for these tasks is Objectivity/DB, and in 1997 COMPASS decided to use this commercial product to store all data for off-line analysis, keeping them under “federated” databases (consisting of separated files on different computers).
The internal structure of the database should allow easy access to the physical quantities needed in the analysis without external bookkeeping. One major advantage is the possibility of “tagging” events by physics properties. Users should thus be able to select subsets of data and access the full information.
The transparent navigation among the events and other analysis objects is very attractive, and Objectivity/DB promises to be able to handle a very large “federation” of database files.