The new generation of “B-factory” electron-positron colliders at SLAC, Stanford (PEP-II), and the Japanese KEK laboratory (KEKB) have both made good progress, exceeding luminosities (a measure of the collision rate) of 10³³ cm⁻² s⁻¹ – previously uncharted territory for electron-positron colliders. PEP-II achieved this figure late last year and KEKB reached it in February.
PEP-II achieved its first collisions in July 1998, but the BaBar physics detector did not appear until May 1999. KEKB and its BELLE detector began operation last June. The aim of these colliders is to mass-produce B particles (containing the fifth, “b”, quark), and the new collision rates are good news for the physicists, who hope to see the first signs of CP violation in the B particle system.
The previous world record electron-positron luminosity was 8 × 10³² cm⁻² s⁻¹, held by the valiant CESR ring at Cornell, which is still in the race. CERN’s 27 km LEP collider is in a different league because of its size, and here the focus is on achieving maximum collision energy.
ISIS, the major facility at the Rutherford Appleton Laboratory (RAL) in Oxfordshire, UK, is the world’s most powerful pulsed spallation neutron source. Since 1984 it has provided beams of neutrons and muons that have enabled the structure and dynamics of condensed matter to be probed on a microscopic scale ranging from the subatomic to the macromolecular, from a proton wavefunction to a protein structure.
Neutron production
Construction of the source was approved in 1977, following a proposal by UK scientists who saw an opportunity to build a world-leading neutron facility replacing the aging NIMROD proton accelerator at the then Rutherford Laboratory. In contrast with the traditional means of neutron production by nuclear fission, which involves the production of a continuous stream of neutrons, ISIS was to be a pulsed neutron source, similar to but much more intense than the existing IPNS source at Argonne National Laboratory in Illinois, US.
First, H⁻ ions would be accelerated in a pre-injector column to 665 keV, then passed into a linear accelerator consisting of four accelerating RF cavities, reaching an energy of 70 MeV. At the point of injection into the final acceleration stage (a 52 m diameter proton synchrotron), the electrons would be stripped from the H⁻ ions by a 0.25 μm alumina foil, to produce a circulating beam of protons.
At full intensity, 2.5 × 10¹³ protons per pulse would be accelerated to 800 MeV, before being extracted and sent to a heavy metal target, producing a burst of neutrons by spallation. This whole process would then be repeated 50 times per second.
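Those pulse parameters fix the time-averaged beam power on the target; a quick back-of-the-envelope check (a Python sketch using only the figures quoted above):

```python
# Time-averaged ISIS beam power from the quoted pulse parameters.
E_PROTON_EV = 800e6            # proton kinetic energy, eV
PROTONS_PER_PULSE = 2.5e13
REP_RATE_HZ = 50               # pulses per second
EV_TO_JOULE = 1.602176634e-19  # J per eV

power_watts = E_PROTON_EV * PROTONS_PER_PULSE * REP_RATE_HZ * EV_TO_JOULE
print(f"{power_watts / 1e3:.0f} kW")  # ~160 kW delivered to the target
```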
As a result of the low duty cycle of the ISIS accelerator, the time-averaged heat production in the ISIS target would be a modest 160 kW, but, in the pulse, the neutron brightness would exceed that of the most advanced steady-state sources. In addition, the structure of the neutron pulse would be exploited using time-of-flight measurement techniques and white neutron beams, thereby providing a direct determination of the energy and wavelength of each neutron detected. The duty cycle of the accelerator would also ensure good signal-to-noise levels.
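The time-of-flight principle itself is simple: the neutron’s speed follows from the known flight path and the measured arrival time, giving its kinetic energy directly and its wavelength via the de Broglie relation. A minimal sketch (the 10 m flight path is an illustrative value, not an ISIS instrument parameter):

```python
# Neutron energy and de Broglie wavelength from time of flight.
H_PLANCK = 6.62607015e-34      # Planck constant, J s
M_NEUTRON = 1.67492749804e-27  # neutron mass, kg
EV = 1.602176634e-19           # J per eV

def neutron_from_tof(flight_path_m, tof_s):
    """Return (energy in meV, wavelength in angstroms) for one detected neutron."""
    v = flight_path_m / tof_s                         # speed from time of flight
    energy_mev = 0.5 * M_NEUTRON * v**2 / EV * 1e3
    wavelength_A = H_PLANCK / (M_NEUTRON * v) * 1e10  # de Broglie wavelength
    return energy_mev, wavelength_A

# A thermal neutron (~2200 m/s) over an illustrative 10 m path:
print(tuple(round(x, 2) for x in neutron_from_tof(10.0, 10.0 / 2200.0)))  # (25.3, 1.8)
```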
The first neutrons were produced in late 1984 and ISIS was officially inaugurated in October 1985. The facility reached its design specification of delivering a 200 μA beam to the target station in 1993, and it has run more or less consistently at this level since then.
Over the past 15 years, ISIS has attracted substantial international investment and has developed into a major international force in condensed matter research. It has seen its complement of instruments rise from 6 to more than 20 and its user base from 200 to more than 2000. This popularity reflects the fact that the neutron is in many ways the ideal probe for the study of solids and liquids.
The citation for the 1994 Nobel Prize for Physics to Brockhouse and Shull for their pioneering work in neutron scattering put this point succinctly – neutrons simultaneously probe the structure and dynamics of matter, bringing insight at an atomic and molecular level about where atoms “are” and what atoms “do”.
Structural and dynamical studies at ISIS have had a major impact at the cutting-edge of materials development at both the fundamental and the applied levels. ISIS has been heavily involved in many of the most exciting stories of recent years, including the physics “Woodstock” of high Tc superconductors and the discovery of a new form of carbon, C60 (Buckminsterfullerene). On the applied side, work at ISIS underpins the development of materials such as batteries, detergents, catalysts, pharmaceuticals and polymers.
Over the past 15 years the problems tackled at ISIS have become ever more diverse and challenging. The desire to collect more data faster has become irresistible, as areas such as in situ and time-resolved studies on increasingly complex systems become more important. To handle this trend, detector arrays have expanded enormously from those originally installed, employing “spin-off” technology borrowed from high-energy physics.
Next-generation instruments at ISIS, such as the new MAPS spectrometer and the GEM diffractometer, now include detector arrays with areas as large as 16 m² – orders of magnitude larger than those available in 1984 and containing more than 50 million data points per measurement. Much of the development of these new neutron detectors, both in terms of front-end construction and signal encoding, has been undertaken at RAL.
In tandem with these large detector arrays, the data acquisition and storage systems at ISIS must also be state of the art to handle the huge volumes of data generated. ISIS instruments operate to a common data acquisition framework based on the RAL-developed electronics. The ability to develop, build and support such advanced systems in house has again relied a great deal on experience gained from RAL’s historical and continuing involvement in other aspects of high-energy and particle physics.
Looking to the future
While the trend towards massive detector arrays is one way of increasing the number of neutrons utilized in an experiment, developments in the synchrotron ring are taking place that will increase the current that can be delivered to the target to 300 μA.
This involves the addition of a second harmonic to the existing accelerating RF waveform, achieved by the insertion of four new RF cavities into the existing ring. As well as benefiting all of the instruments clustered round the existing target station via increased neutron production, this enhanced current can be shared with a second target station optimized for the production of longer-wavelength cold neutrons, opening up new research opportunities in fields such as complex macromolecular assemblies, magnetism, colloid and surface chemistry, high-resolution diffraction and the biological sciences. Furthermore, the enhanced current will be essential if the SIRIUS project, which aims to utilize the spallation source as a method of producing radioactive nuclei for post-acceleration, is to become a reality.
Non-proliferation through scientific co-operation is the mission statement of the International Science and Technology Centre. The centre was established in 1992 under an agreement between the European Union, Japan, the Russian Federation and the US. Since 1992, other member nations have joined: Norway, Korea, Armenia, Belarus, Georgia, Kazakhstan and Kyrgyzstan.
Based in Moscow, the International Science and Technology Centre (ISTC) provides ex-weapons scientists from former Soviet Union (FSU) countries with the opportunity to refocus their talents on peaceful activities. These activities may include helping to solve national and international technical problems; supporting the transition to market-based economies; contributing to basic and applied research; and encouraging the integration of other former weapons scientists into the international scientific community.
Between 1992 and 1999, ISTC programmes funded 830 projects worth US$230 million, providing grant payments to more than 30 000 workers. Particle physics and CERN played a valuable role in bringing together scientists and promoting understanding during the Cold War, and this role continues in the ISTC.
The main thrust of the ISTC programme is to support projects for FSU centres in collaboration with foreign firms or organizations. A significant proportion of the funding comes from ISTC sources. Each project is assigned to a CERN-familiar Russian “lead institute”, which acts as a gateway to a frequently unfamiliar new supplier.
This traditional ISTC programme was extended in 1997 by the Partnership Programme, under which western organizations fund research and development to be conducted on their behalf at FSU centres via the ISTC. In this framework the outside funding is channelled by the ISTC, which also provides the necessary infrastructure and management inside the FSU region.
Overall, Partnership Programme contracts now exist with almost 60 partners, who work in the electrical, biomedical and chemical industries, as well as at research centres such as CERN. ISTC director-general Alain Gérard declared: “CERN-ISTC co-operation in high-energy physics continues to be a shining example of the unique and effective nature of the Partnership Programme, and it is a model for our future activities with CERN and other FSU institutes.”
The ISTC “business”, under both the traditional ISTC activities and the Partnership Programme, represents only a fraction of the total CERN–Russia collaboration. However, it significantly extends CERN’s involvement to include additional institutes and a wider pool of expertise.
Case-studies
One of the first major projects under the ISTC banner began in 1994. This was a feasibility study of technologies for the accelerator-based conversion of plutonium and long-lived radioactive waste.
The Russian lead institute was Moscow’s Institute for Theoretical and Experimental Physics (ITEP), with six other Russian centres also being involved. The partners were CERN and the US Los Alamos National Laboratory, both of which have demonstrated the feasibility of using particle beams from accelerators to transmute weapons grade plutonium and to reprocess nuclear waste.
Directly involved in CERN’s mainstream programme of research was an ISTC project for the design and construction of a cryostat and vacuum windows for a large liquid krypton calorimeter. The calorimeter is a key element of the NA48 CP violation experiment, which recently announced its initial results. Arranged through the Joint Institute for Nuclear Research (JINR) in Dubna, near Moscow, the project involved the Khrunichev State Space Science and Production Centre and ENTEK in Moscow. INFN Pisa was also a major partner.
Development work for experiments at CERN’s future LHC collider is the theme of several major ISTC projects. The development of special computer systems to facilitate LHC detector design is the goal of VNIITF in Snezhinsk, with JINR also involved. Recently the Snezhinsk institute assumed responsibility for the construction of major support structures for the ATLAS experiment, also involving JINR Dubna, IHEP Protvino and MPI Munich.
The inner tracker and forward multiplicity detector (FMD-MCP) for the ALICE experiment at CERN involves TsKBM of St Petersburg as the lead institute, together with JINR; the Nuclear Physics Institute in Gatchina, St Petersburg; the Kurchatov Research Centre, Moscow; the Mendeleev Institute of Metrology, St Petersburg; and several other St Petersburg concerns, as well as the university. On the CERN side, INFN Ferrara and Utrecht University provide the main interface.
For CMS, the production of lead tungstate crystals for the electromagnetic calorimeter was investigated by the Bogoroditsk Techno-Chemical plant. After this pilot study, mass-production of the 80 000 crystals will be shared by the Bogoroditsk plant and the Shanghai Institute of Ceramics.
For the endcap of the ATLAS hadron calorimeter, the main Russian partner is the Institute for High-Energy Physics (IHEP) in Protvino, near Moscow, which is working with NPO Molniya of Moscow. The main collaborators are the Max-Planck-Institut in Munich and CERN.
Several items for LHC detectors involve mass-production, such as the injection moulding of 0.5 million 200 × 400 mm transparent scintillation tiles to clad the ATLAS hadron calorimeter. IHEP in Protvino is the main Russian coordinator, with two specialist Russian concerns. For ATLAS, the project is handled by the Laboratório de Instrumentação e Física Experimental de Partículas, Lisbon.
Experimental astrophysics is all about detecting photons from distant objects, and over an increasingly wide range of wavelengths. Progress in the field advances hand in hand with advances in photon-detection technology, particularly superconductors.
The absorption of a photon in a superconductor is followed by a series of fast processes that involve the breaking of Cooper pairs by energetic phonons created by the hot electrons produced as the atom relaxes after the initial photoabsorption. The result of this cascade is that the photon’s energy is converted into a population of free charge – quasiparticles – in excess of any thermal population.
For typical transition metals, this conversion process takes from nanoseconds (niobium) to microseconds (hafnium). At sufficiently low temperatures (typically about an order of magnitude below the superconductor’s critical temperature), the number density of thermal carriers is very small, so in a superconductor such as tantalum the mean number of free charge carriers created by a photon far exceeds the thermal population.
The quasiparticles produced through photoabsorption can be detected by applying a DC potential across two such films separated by a thin insulating barrier, forming a superconducting tunnel junction (STJ). This potential bias favours the transfer of quasiparticles from one film to the other via quantum mechanical tunneling across the barrier. The detector signal is therefore represented by the current developed by this tunnel process.
After initial tunnelling, a quasiparticle can, moreover, tunnel back and contribute many times to the overall signal before it is lost, further boosting the device’s response.
The overall limiting resolution for an STJ depends on the characteristics of the superconductor, but it is predictable. Figure 1 illustrates this resolution for a number of elemental superconductors.
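The origin of that predictable limit can be sketched with standard BCS estimates: a photon of energy E creates roughly E/(1.7Δ) quasiparticles, where the gap Δ ≈ 1.76 k_B T_c, and counting statistics then set a Fano-limited resolution. The values below are textbook estimates, not figures from this article, and real devices add tunnel noise on top:

```python
import math

K_B_EV = 8.617333262e-5  # Boltzmann constant, eV/K

def stj_fano_limit_ev(photon_energy_ev, t_c_kelvin, fano=0.2):
    """Fano-limited FWHM resolution of an STJ, in eV.

    Uses the BCS gap Delta ~ 1.76 k_B T_c and a mean energy of
    ~1.7 Delta to create one quasiparticle; real devices add a
    tunnel-noise term on top of this limit.
    """
    gap_ev = 1.76 * K_B_EV * t_c_kelvin
    epsilon = 1.7 * gap_ev  # mean energy per quasiparticle created
    return 2.355 * math.sqrt(epsilon * photon_energy_ev * fano)

# Tantalum (T_c ~ 4.5 K) at a 500 eV X-ray photon:
print(f"{stj_fano_limit_ev(500.0, 4.5):.2f} eV")  # ~0.8 eV; measured devices reach a few eV
```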
High-resolution X-ray spectroscopy provides the ability to determine the electron and ion temperatures, the electron density and the relative abundance of the elements, as well as establishing the degree of thermal and ionization equilibrium in a hot plasma.
While the measurement of the intensity of the hydrogenic and helium-like lines from the same element is an important ion temperature indicator, it is the ability to resolve the satellite lines that can determine the key characteristics of the X-ray-emitting plasma in a model-independent manner.
Figure 2 shows the response of a tantalum STJ to the large complex of lines (the Fe-L complex) around 1 nm, which is expected to be radiated from an optically thin plasma having a temperature of approximately 10⁷ K. In this example, continuum emission has been suppressed for clarity. As the majority of lines are easily resolvable with such a tantalum STJ, measurement of the relative intensity of the lines from the same ion enables the temperature to be uniquely determined. In addition, through the relative intensity of lines from different elements, their relative abundances can be established.
Note that the intensity ratio of resonance lines from different ions of the same element, together with line centroids, allows one to deduce either the degree of ionization equilibrium or possibly the distance to the object via the determination of the redshift. A high spectral resolution is required for such observations. This resolution can be achieved using a tunnel-limited tantalum STJ but is impossible with conventional solid-state devices.
…and at ultraviolet and optical wavelengths
In optical and ultraviolet spectroscopy, high resolution normally implies a resolving power of 10⁴. From figure 1 it is clear that none of the classical superconductors forming the basis of the STJs currently under development (based on niobium, tantalum, aluminium, molybdenum or hafnium) could achieve such resolving power.
In fact, a superconducting critical temperature well below 100 mK is implied to achieve such resolving power, leading to the development of STJs based on elemental superconductors such as rhodium. Of course, things are not quite this simple: the temporal characteristics associated with the production of the free excess charge carriers are a function of the critical temperature, and phonons have wavelengths significantly larger than the thickness of the film. Such low-temperature superconductors may therefore respond significantly more slowly.
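The scaling at work here can be made concrete: with the Fano-limited resolution growing as √(Δ·E) and Δ ∝ T_c, the resolving power goes as √(E/T_c). A rough sketch with textbook critical temperatures (an idealized Fano-only estimate; real devices do worse):

```python
import math

K_B_EV = 8.617333262e-5  # Boltzmann constant, eV/K
FANO = 0.2               # assumed Fano factor

def resolving_power(photon_energy_ev, t_c_kelvin):
    """Fano-limited resolving power E/dE of an elemental-superconductor STJ."""
    epsilon = 1.7 * 1.76 * K_B_EV * t_c_kelvin  # mean energy per quasiparticle
    de_fwhm = 2.355 * math.sqrt(epsilon * photon_energy_ev * FANO)
    return photon_energy_ev / de_fwhm

# Textbook critical temperatures (K) for the elements named in the text,
# evaluated at a ~2 eV optical photon:
for name, t_c in [("Nb", 9.25), ("Ta", 4.47), ("Al", 1.18), ("Mo", 0.92), ("Hf", 0.13)]:
    print(name, round(resolving_power(2.0, t_c)))  # every element falls far short of 10^4
```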
Given that the resolution of a typical STJ based on tantalum is not appropriate for high- or even medium-resolution spectroscopy, what are the alternative key attributes of such a device for optical/ultraviolet astronomy?
Timing precision (below 10 μs), coupled with the broadband spectral capability, may make this the ideal spectrophotometer. Objects such as pulsars and flare stars may be ideal targets to observe with narrow-field small arrays. In addition, the efficiency at ultraviolet wavelengths, coupled with a large-format array (a panoramic detector), may allow for the development of an efficient broadband imaging spectrometer to determine the low-resolution spectra of faint objects, allowing for deep-field surveys.
Such surveys could allow the determination in a single exposure of the broadband spectra and possibly therefore the redshift (and thus age) of all objects through the measurement of the Lyman edge and the Lyman emission lines – the “Lyman forest”.
Owing to redshift, the Lyman edge falls close to the wavelength of optimum performance for a tantalum-based STJ, with an efficiency of some 70% and a resolution of 20 nm.
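The arithmetic of the redshift measurement is direct: a feature at rest wavelength λ₀ is observed at λ₀(1 + z). A sketch with the standard 91.2 nm rest-frame Lyman limit and purely illustrative redshifts:

```python
LYMAN_LIMIT_NM = 91.2  # rest-frame Lyman edge of hydrogen

def observed_wavelength_nm(rest_nm, z):
    """Wavelength at which a rest-frame feature appears for redshift z."""
    return rest_nm * (1.0 + z)

# Illustrative redshifts: the edge sweeps through the 200-600 nm band
# where tantalum STJ efficiency is quoted at ~70%.
for z in (1.5, 3.0, 5.0):
    print(z, round(observed_wavelength_nm(LYMAN_LIMIT_NM, z), 1))
```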
However, it is clear that STJ devices based on lower-temperature superconductors such as hafnium would allow the clear evaluation of redshift. Of course, the response of the STJ in the ultraviolet is particularly attractive for future space-based astronomy missions.
Tantalum-based STJs build on earlier work with niobium. The predictability of both tantalum and niobium devices (cf. figure 1) gives some confidence in the ultimate successful development of lower-temperature elemental superconducting tunnel junctions, such as those based on hafnium.
Figure 3 illustrates the measured spectra from a tantalum STJ forming part of a 6 × 6 array illuminated by monochromatic radiation of various wavelengths. The array is shown in figure 4; each device was 25 × 25 μm and consisted of two films, each 100 nm thick. Only those photons absorbed in the base film, separated from top-film and substrate events by their distinct signal risetime, are shown. Typical resolutions of 0.015 nm (3.5 eV) at 2.4 nm (~500 eV) were measured and are indicated in figure 1.
At optical and ultraviolet wavelengths, where the photon energy is small, spatial effects, which degrade the response below 0.5 nm, are unimportant. Here it is rather that the signal is low, so the signal-to-noise ratio is the dominant factor governing the resolution.
At these wavelengths the photons enter the detector through the substrate, which can be either sapphire or magnesium fluoride, depending on the short wavelength cut-off required. The theoretical efficiency of a tantalum device deposited on a sapphire substrate with this mode of illumination is high. All photons are absorbed in the high-quality epitaxial tantalum base film. Efficiencies of 70% from 200–600 nm are expected, limited at short wavelengths by the cut-off of the sapphire substrate, and these have been experimentally confirmed.
To illustrate the broadband response of this type of detector, figure 5 shows the charge spectrum from a single tantalum-based device illuminated with optical light via a grating monochromator. This response ranges from 296 to 1183 nm – from the ultraviolet to the near-infrared.
Precise determination
Not only are the various orders well resolved, but the charge output as a function of wavelength can be precisely determined leading to a high wavelength linearity. This allows wavelength resolution to be determined across a broad waveband, and this is shown for both tantalum- and niobium-based devices in figure 1.
The astronomical capabilities of this unique detector have now been demonstrated. Figure 6 shows the light curve of the Crab pulsar obtained from a 6 × 6 pixel optical array installed in early 1999 at the William Herschel Telescope, La Palma.
All photons from this pulsar – a spinning neutron star with a period of 33 ms, some 6000 light-years away – are detected individually, and their energy (wavelength) and arrival time are recorded. This opens up the pulse-phase spectroscopy of exotic objects, and demonstrates the feasibility of the technique for space-based missions covering a range of wavelengths from 0.5 nm to 2 μm (approximately 0.5–2500 eV).
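Turning individually time-tagged photons into a light curve such as that in figure 6 amounts to folding each arrival time on the pulsar period. A toy sketch of the folding step (synthetic arrival times, not the instrument’s real pipeline):

```python
# Fold photon arrival times on a pulsar period to build a light curve.
PERIOD_S = 0.033  # Crab pulsar period, ~33 ms

def fold(arrival_times_s, period_s=PERIOD_S, n_bins=20):
    """Histogram photon arrival times by pulse phase in [0, 1)."""
    counts = [0] * n_bins
    for t in arrival_times_s:
        phase = (t % period_s) / period_s
        counts[int(phase * n_bins)] += 1
    return counts

# Synthetic data: 100 pulsed photons near phase 0.275, plus background.
photons = [(k + 0.275) * PERIOD_S for k in range(100)]
photons += [k * 0.0137 for k in range(50)]  # uncorrelated arrivals
curve = fold(photons)
print(curve.index(max(curve)))  # brightest bin is the pulse peak: bin 5
```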
Radiofrequency is the motive power of particle accelerators and, in the continuing bid for higher energies, superconducting techniques are increasingly being used to squeeze out the maximum number of electron volts from the available power.
The biennial Radiofrequency Superconductivity Workshop reflects this progress and focuses attention on new goals. The ninth workshop was organized by Los Alamos on 1-5 November 1999 at Santa Fe, New Mexico, under chairman Brian Rusnak. About 180 participants registered – slightly fewer than at the previous workshop – but with ever-increasing participation from industry, whose representatives also presented their latest high-tech products: cavities, couplers, high-quality niobium and related fabrication tools.
The workshop began with operations and laboratory review talks. It opened with the achievements of the superconducting radiofrequency (RF) system for CERN’s LEP2 electron-positron collider, which is currently the largest system in the world with about 3500 MV of “superconducting voltage”.
The niobium-film-on-copper (Nb/Cu) cavities clearly outperform their specified field gradient. This performance gives LEP beams of up to 102 GeV, a level that not even optimists dared to dream of when the first modules were installed. Even above design specifications, the system runs very reliably, and about 15 MW of RF power is regularly transmitted to the beam.
CEBAF, the continuous-wave electron-recirculating linac at the Jefferson Laboratory, Newport News, routinely delivers beams at 5.5 GeV, well above the 4 GeV that was originally specified.
ATLAS, the versatile heavy-ion accelerator in operation and being extended at Argonne, the more recent ALPI at Legnaro and other smaller machines regularly accumulate superconducting cavity hours.
In Cornell’s CESR electron-positron collider, the four existing copper five-cell cavities have been replaced by four single-cell superconducting cavities, with strong higher-order mode damping to allow the beam current and luminosity to be increased significantly.
The international TESLA project
Many advances in production and material testing result from the international TESLA project and its test facility at DESY, which also operates as a free-electron laser.
R&D continues to yield dividends. The spinning of whole multicell cavities from seamless tubes was reported, as was non-destructive metal-sheet probing (for example, detecting a foreign metal inclusion via the tiny magnetic field of the thermocurrent produced when a localized area of a niobium sheet is heated).
The stiffening of cavities – to reduce Lorentz force deformation and detuning – with external thermal copper spraying was examined. Controlling these pulsed cavities in amplitude and phase to maintain beam quality is not trivial. However, a working system, using digital controllers with feedback and feed-forward, has been built and tested successfully.
A more difficult task will be the control and operation of the proposed superstructure of four seven-cell cavities, connected by short beam pipes and driven by a single power coupler at one end.
The nine-cell TESLA design accelerating gradient of 25 MV/m has become routine at the test facility. Even better, single-cell lab tests at different places approach gradients of close to 40 MV/m.
Until recently, electropolished and buffered-chemical-polished cavities seemed to show the same performance, the latter method being simpler. However, as already reported by Japan’s KEK laboratory at the last workshop, electropolishing gives higher gradients and makes the “holy” heat treatment at about 1400 °C redundant.
High-pressure water rinsing has become a standard finishing surface treatment, and other liquids – for example, detergents – have been proposed. Also, a weak bakeout at only 145 °C improved a cavity significantly. The progress is still encouraging.
These achievements are complemented by a better understanding of the superconducting surface. A major effort for the Nb/Cu cavities at CERN has given a better picture of the influence of RF and external magnetic fields responsible for the decrease of the resonance Q-value with higher RF field (Q-slope). This Q-slope makes today’s Nb/Cu cavities – the thermal stability of which is otherwise very attractive – non-competitive for very high gradients.
A parallel effort correlated copper surface treatment and film production parameters – for example, using different noble gases for sputtering – with their composition and RF performance.
A quadrupole-resonator test cavity has been built to probe surface resistance. The Q-slope is also present, in a weaker form, for solid niobium cavities, so several models have been proposed to explain the effect. However, the jury is still out.
Improvements
High-power couplers – several with adjustable coupling – have been improved at many laboratories. All of them reach the several hundred kilowatt range in continuous-wave operation at frequencies between 350 MHz and 1.3 GHz.
After a successful application in LEP, most coaxial couplers include an option for DC bias voltage to suppress multipacting. Despite their increased performance, power couplers remain the bottleneck in high-current machines, limiting the useful cavity gradient to far below today’s possibilities.
For CERN’s LHC 400 MHz single-cell cavities, the operational gradient is only 5.5 MV/m. All 21 bare cavities have been fabricated by industry with proven LEP2 Nb/Cu technology. They have all been tested successfully, ready for assembly into cryomodules.
Accelerated electrons become relativistic very soon in a linac, but the much heavier protons are significantly slower than the velocity of light in much of the machine. Thus, modified spherical cavity shapes with reduced cell length (reduced beta cavities) have been designed, built and successfully tested at several laboratories for high current proton machines. (At CERN’s LHC, the protons will already become relativistic in the injection chain.)
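The degree of “slowness” is just relativistic kinematics: a proton’s β = v/c follows from its kinetic energy and rest energy. A short illustrative calculation spanning the energy range typical of such proton machines:

```python
import math

PROTON_MC2_MEV = 938.272  # proton rest energy, MeV

def beta(kinetic_energy_mev):
    """v/c for a proton of the given kinetic energy (special relativity)."""
    gamma = 1.0 + kinetic_energy_mev / PROTON_MC2_MEV
    return math.sqrt(1.0 - 1.0 / gamma**2)

# Electrons are ultra-relativistic at a few MeV; protons are not,
# which is why reduced-beta cavity shapes are needed:
for t_mev in (100, 500, 1000, 2000):
    print(t_mev, round(beta(t_mev), 3))  # still well below beta = 1 at 2 GeV
```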
The Q-value and accelerating gradient of these cavities is intrinsically lower compared with cavities for fully relativistic beams. Despite lower accelerating gradients, a high-quality superconducting surface is as important here as it is for high-gradient cavities.
There are several studies for new high-current proton linacs in the range of a few hundred mega-electron volts to 1-2 GeV – for spallation sources or nuclear waste transmutation – that have a normal conducting and a superconducting option. The proven reliability of today’s large superconducting RF systems – running for many months without real interruption – should help to convince funding agencies of the value of the latter option.
The last day of the workshop looked at projects like the TESLA linear collider proposal and an upgrade of CEBAF to 12 GeV, and possibly to 30 GeV. After LEP is closed, the LEP2 RF system could be recommissioned in an Electron Laboratory for Europe recirculating-electron linac project. Even a possible muon collider with superconducting RF was presented.
The tenth workshop will be held in 2001 in Japan and will be organized jointly by JAERI and KEK. The chairperson will be Shunichi Noguchi from KEK. The community is already convinced that there will be a lot more to report.
The first major contract for the CMS experiment at CERN’s LHC was completed last month when the last of 120 forged iron blocks rolled off the production line at the Izhora factory, St Petersburg. The blocks, weighing up to 41 tonnes, will make up the experiment’s barrel magnet yoke. The occasion was marked by a ceremony and a press conference in St Petersburg on 12 January.
Izhora won the contract following an international call to tender by the CMS magnet contractor, the German firm Deggendorfer Werft und Eisenbau GmbH. The contract, valued at DM 4 million, was for 3500 tonnes of forged and machined iron split into 120 blocks and produced in five batches for the five rings of the CMS barrel yoke. One of the reasons that Izhora won the contract is that there are few factories in the world capable of forging blocks on the scale required by CMS.
The blocks travel from St Petersburg to Deggendorf near Munich, where they are further machined and assembled. The trial assembly of the first ring was in September 1999 and the blocks for the final ring are now on their way to the German factory. After test assembly at Deggendorf, they will be shipped to CERN for final assembly in July.
Izhora has already supplied iron for experiments at Brookhaven, DESY and Fermilab, as well as for the Delphi experiment at CERN.
In Experimental Hall B at the Jefferson Laboratory, Newport News, Virginia, the CEBAF Large Acceptance Spectrometer (CLAS) has opened a new “window” on the building blocks of matter: mesons, nuclei and nucleons. CLAS enables detailed studies of the spectrum of nucleon-excited states. This is a source of vital information about the nucleon’s constituents and the forces between them.
The large-acceptance CLAS serves experiments that require the simultaneous detection of several loosely correlated particles in the hadronic final state, and measurements at limited luminosity. It collects data at the unprecedented rate of 3000 events per second, which is substantially higher than its design goal of 1500 events per second. Six superconducting coils generate its toroidal magnetic field.
A seven-year collaboration between 34 institutions in the US, France, Italy, Armenia, Korea, the UK and Russia built CLAS for use in Hall B – the last of three experimental halls to become fully operational at Jefferson Lab’s Continuous Electron Beam Accelerator Facility (CEBAF).
The superconducting radiofrequency CEBAF accelerator, which was originally designed for 4 GeV, but which is now delivering up to 5.5 GeV, provides three simultaneous continuous-wave beams of independent current and independent but correlated energies.
With Jefferson Lab seeking to bridge the gap between quark and hadronic descriptions of nuclear matter, CLAS’s operation fits into a three-hall programme of complementary experiments guided by quantum chromodynamics, the fundamental theory of quark interactions. The laboratory’s earliest experiments began in Hall C in late 1995.
CLAS enables particles to be tracked and identified, and their energy, momentum and initial direction to be defined. It does this by using information provided by drift chambers, Cherenkov counters, scintillation counters and electromagnetic calorimeters.
CLAS will be a crucial tool in Jefferson Lab’s investigation of the quark-gluon structure of the nucleon and, in particular, will facilitate the detailed study of its spectrum of excited states. As in atomic physics, the spectrum of this system contains vital information about the nature of the nucleon’s constituents and the forces between them.
It is not clear why the naive constituent quark model is so successful in explaining the particle spectrum discovered so far. CLAS will either support this model by discovering the complete pattern of states that it predicts, or it will reveal the model’s shortcomings.
The University of Liverpool has just commissioned a major computer system that is dedicated to the simulation of data for current and future scientific experiments.
One of the largest in Europe, the system comprises three hundred 400 MHz PCs running under Linux. The primary role of the computer system is to simulate large numbers of events to help to optimize the design of the central vertex detector for the LHCb experiment at CERN’s LHC proton collider.
The Monte Carlo Array Processor (MAP) is now fully commissioned and produces more than 250 000 fully simulated events per day. All of the components of the system are low-price commodity items packed into custom rack-mounted boxes. The mounting ensures minimal space requirements and optimal cooling.
The power of MAP reflects the simplicity of its architecture, with essentially all of the PCs dedicated to one job. A custom control system and protocol written at the University of Liverpool has enabled very reliable communication between the “master” and the “slave” nodes on the 100BaseT internal network.
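The master/slave pattern described above can be sketched in a few lines. The actual Liverpool control system and protocol are not reproduced here; this is a minimal, hypothetical in-process analogue in which a master queue hands out simulation job numbers and idle slaves pull the next job until the queue is empty.

```python
# Minimal sketch of a master/slave job farm in the spirit of MAP's
# architecture (illustrative only; the real system uses a custom
# network protocol between separate PCs).
import queue
import threading

def run_farm(n_jobs, n_slaves, simulate):
    """Distribute n_jobs among n_slaves; return results keyed by job id."""
    jobs = queue.Queue()
    for job_id in range(n_jobs):
        jobs.put(job_id)          # master fills the work queue
    results = {}
    lock = threading.Lock()

    def slave():
        while True:
            try:
                job_id = jobs.get_nowait()
            except queue.Empty:
                return            # no more work: slave goes idle
            out = simulate(job_id)
            with lock:
                results[job_id] = out

    workers = [threading.Thread(target=slave) for _ in range(n_slaves)]
    for w in workers:
        w.start()
    for w in workers:
        w.join()
    return results

# Usage: a toy "simulation" that just squares the job id.
if __name__ == "__main__":
    res = run_farm(n_jobs=10, n_slaves=3, simulate=lambda i: i * i)
    print(sorted(res.items())[:3])  # → [(0, 0), (1, 1), (2, 4)]
```

Because the slaves share no state apart from the job queue, throughput scales almost linearly with the number of nodes – the property that lets MAP keep essentially all of its PCs busy on one problem.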
A small fraction of the system is reserved for development, and the team hopes to use it to test direct node-to-node communication. This would enable MAP to handle problems of much wider applicability than event simulation alone.
The project will provide insight into the operation of the large-scale PC arrays planned for the LHC, as well as providing the LHCb collaboration with sufficient computing power for its vertex detector optimization studies.
Despite its power, MAP is still a long way from being a general-purpose machine for analysing real or simulated data. A potential solution to the storage and analysis of large amounts of data is to store the output of the experiment on large disk servers.
The Liverpool team has tested a prototype 1 Tbyte server on loan from Dell Computers, UK. Unlike standard RAID architectures, it has no specialized hardware components: simply 1 Tbyte of SCSI disks attached to a high-performance server. This makes it inexpensive compared with standard systems, and the team hopes to equip MAP with such a storage system to test its operation in this environment.
Spin-oriented (polarized) electron beams are high on the agenda at the Jefferson Laboratory, Newport News, Virginia. To ensure uninterrupted delivery of these beams, a second polarized electron gun has been added.
About 50% of all the Laboratory’s experiments require polarized beams and an even larger fraction of the major experiments use them. With the Continuous Electron Beam Accelerator Facility (CEBAF) delivering beams to two or three halls simultaneously, in practice the polarized source has to run 100% of the time. The Laboratory has been running polarized beams since April 1998 and plans to continue into the spring of 2001.
Several other improvements were made to the polarized source. For example, in the past, temperature fluctuations near the lasers that illuminate the photocathodes affected beam stability. An air-conditioned housing has now been built around the three lasers. Improved laser controls and electronics hardware were also installed. Work is in progress to reconfigure the laser systems so that switching beam delivery between the polarized guns can be done with the push of a button.
A key component of both electron guns is their dime-sized gallium arsenide photocathode. These photocathodes gradually lose their emitting properties over time. Although the laser can be refocused on different sections of the photocathode, eventually the material’s effectiveness decreases and the entire crystal must be replaced.
During replacement, Injector Group personnel must open the ultra-high-vacuum chamber within the injector. Although this process lasts only minutes, re-establishing the ultra-high vacuum involves bakeout and can take up to 50 hours.
With two polarized guns, one can be taken out of operation as necessary. With planned upgrades to the laser system and continued investigations into more efficient and durable photocathode materials, researchers should be able to take full advantage of their allotted beam time in the coming months and years.
Neural computation – analysing data by simulating the way the brain works – is another area of application in which high-energy physics is in the vanguard of development. The state of the art was on show at the recent Neural Computation in High-Energy Physics Workshop, NCHEP-99, held in Ma’ale Hachamisha (near Jerusalem), Israel. The workshop, organized by Halina Abramowicz and David Horn of Tel Aviv, gathered together 30 participants from 10 countries and showed that the brain-modelled computational techniques with which high-energy physicists have been experimenting for 10 years have definitely come of age.
Highlights on the triggering front included a status report, presented by Christian Kiesling of MPI Munich, on the much-heralded neural net trigger of the H1 experiment at DESY. This trigger was essential in extending the photon-proton energy acceptance for elastic J/psi production over the entire kinematic range of DESY’s HERA electron-proton collider (figure 1), well beyond what is obtainable using standard techniques. Another highlight was a talk by Joao Varela of LIP Lisbon on a modular neural electron/photon trigger for the level-1 trigger of the CMS experiment at CERN’s LHC collider, which has been benchmarked at 40 MHz in a test beam.
Talks placing neural techniques within the broader spectrum of high-energy physics methods included one on the Silicon Vertex Tracker, now well under way at the CDF experiment at Fermilab (Franco Spinella, INFN Pisa), and overviews by Grzegorz Wrochna of Warsaw and Saul Gonzalez of CERN of the trigger systems of the LHC experiments CMS and ATLAS, respectively. Possible future applications, presented by Erez Etzion and Gideon Dror of Tel Aviv, included a muon transverse-momentum trigger for ATLAS and a z-vertex position finder for ZEUS.
Finally, Bruce Denby of Versailles and Jose Seixas of Rio de Janeiro gave overviews of neural network hardware platforms, including digital signal processors (DSPs) and field-programmable gate arrays (FPGAs).
Neural network methods have now become accepted as part of the standard toolbox of off-line data analysis techniques. This was evidenced by two D0 (Fermilab) results: a leptoquark production limit (presented by Silvia Tentindo-Repond of Florida State) and a precision measurement of the top quark mass (presented by Harpreet Singh of UC Riverside). Members of DELPHI and OPAL also spoke about further studies in the Higgs quest, rare B decays and tau-exclusive branching ratios. Anatoli Sokolov of IHEP, Moscow, gave a comprehensive overview of off-line applications, as well as some hints for the future.
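The kind of neural discriminant used in such off-line analyses can be sketched in its simplest form: a single sigmoid neuron trained by gradient descent to separate “signal” from “background” events. The toy data and feature values below are invented for illustration; real analyses use multi-layer networks and many kinematic variables.

```python
# Illustrative sketch (not any experiment's actual analysis): a single
# sigmoid neuron -- the simplest neural discriminant -- trained to
# separate signal from background events using two toy features.
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def train(x, y, lr=0.5, epochs=2000):
    """Batch gradient descent on the cross-entropy loss."""
    w = np.zeros(x.shape[1])
    b = 0.0
    for _ in range(epochs):
        p = sigmoid(x @ w + b)        # network output in [0, 1]
        grad = p - y                  # dL/dz for sigmoid + cross-entropy
        w -= lr * x.T @ grad / len(y)
        b -= lr * grad.mean()
    return w, b

# Toy events: "background" clusters at low feature values,
# "signal" at high values (entirely synthetic data).
rng = np.random.default_rng(0)
bkg = rng.normal(0.0, 0.5, size=(200, 2))
sig = rng.normal(2.0, 0.5, size=(200, 2))
x = np.vstack([bkg, sig])
y = np.concatenate([np.zeros(200), np.ones(200)])

w, b = train(x, y)
score = sigmoid(x @ w + b)            # per-event discriminant
accuracy = ((score > 0.5) == y).mean()
```

In a physics analysis the network output would not be cut on directly at 0.5; instead the cut is chosen to optimize the expected signal significance, and the surviving sample feeds the limit or mass measurement.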
At the end of the workshop, David Horn presented highlights of the Neural Computation in Science and Technology Conference, which had taken place at the same venue immediately before NCHEP-99. According to him, if high-energy physicists wish to remain up-to-date in this field, they will have to familiarize themselves with some of the more modern techniques such as Independent Component Analysis, Support Vector Machines, advanced clustering techniques, and genetic optimization.
NCHEP-99 was sponsored by the Israel Science Foundation.