
CERN workshop studies electron clouds

By replacing its Large Electron Positron collider with a proton-proton collider, CERN will be able to generate much higher energy collisions for physicists to examine. The amount of energy lost to synchrotron radiation by particles on curved paths decreases with the mass of the particles, and is therefore much less for protons than for electrons. Synchrotron radiation nevertheless poses a problem for designers of high-intensity proton accelerators, since although the energy loss is less, the number of photons emitted can actually be higher and their energy increases with the cube of the beam energy. These photons can lead to a number of undesirable phenomena, including heating and gas desorption from the vacuum chamber walls. Perhaps the most difficult to deal with, however, is photoemission of electrons from the vacuum chamber, which at 7 TeV in the Large Hadron Collider (LHC) is the dominant mechanism of electron generation and can lead to the establishment of an electron cloud that can cause beam deterioration.
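
The mass and energy scalings referred to here follow from the standard synchrotron-radiation formulas (textbook relations, quoted only as a rough guide and not taken from the LHC design studies). For a particle of energy E and mass m bent with radius of curvature ρ,

\Delta E_{\mathrm{turn}} \propto \frac{E^{4}}{m^{4}\rho}, \qquad E_{c} \propto \frac{\gamma^{3}}{\rho} \propto \frac{E^{3}}{m^{3}\rho},

so at a given beam energy a proton loses roughly (m_e/m_p)^4, about 10^{-13}, as much energy per turn as an electron, while the critical photon energy E_c still rises as the cube of the beam energy.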

Electron-cloud phenomena have been observed at many accelerators around the world, including CERN where LHC-type beams in the proton synchrotron and super proton synchrotron (SPS) have generated clouds. In the LHC bending arcs at full energy, the process begins with synchrotron radiation photons emitted in a narrow band striking the outside wall of the accelerator’s vacuum chamber. The majority liberate electrons, which are turned back by the dipole magnetic field and reabsorbed. Some photons, however, are reflected and go on to liberate electrons from the top or bottom of the vacuum chamber. These electrons are accelerated by the charge of a passing bunch of positively charged particles and can go on to free further low-energy electrons from the opposite wall of the chamber. If a sufficiently large fraction of low-energy electrons survives long enough, successive passing bunches lead to a runaway effect known as multipacting, which generates the electron cloud.
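
A convenient way to summarize the runaway condition (a deliberately simplified picture; the simulation codes discussed at the workshop track the full electron dynamics) is through an effective multiplication factor per bunch passage. If each passing bunch turns the N_k surviving low-energy electrons into

N_{k+1} = \delta_{\mathrm{eff}}\, N_{k},

then the cloud builds up whenever the effective yield δ_eff, which folds together secondary emission at the walls and the losses between bunches, exceeds unity; the electron density then grows geometrically along the bunch train until space charge saturates it.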

A copper-coated beam screen will be installed within the vacuum chamber of the LHC. This serves to carry away heat, and also controls the electron cloud in the dipole magnets by limiting the number of reflected electrons. The pressure increase caused by the electron cloud, its impact on beam diagnostics and, for the LHC, the heat load on the beam screen and cold bore are further primary concerns. Surface conditioning by electron bombardment will rapidly lower gas desorption and secondary electron yield of the beam screen surface. When electron multiplication is sufficiently reduced, it will no longer compensate for electrons lost between two successive bunches, and there will be little or no build-up of the electron cloud. This principle has recently been demonstrated at the SPS.

Future machines

The CERN workshop brought together some 60 participants from 17 institutes to discuss electron-cloud simulations for proton and positron beams. Simulations for future linear colliders and intense proton drivers suggest that in these machines, electrons in the vacuum chamber may reach densities some 10-100 times higher than in existing machines. Workshop participants reviewed a number of simulation codes that have been developed using different approximations and including different physics. Key aims of the meeting were to review current analytical, simulation and modelling approaches to the electron-cloud problem, determine the important outstanding questions, and develop a strategy for future studies. Reports on the current status of experimental observations worldwide served as a motivation and benchmark for the simulation studies.

Experimental work carried out at many different laboratories in Europe, Japan and the US was reported in the two opening sessions of the workshop. Results from laboratory measurements of secondary electron emission and electron energy spectra – an invaluable input for the electron-cloud modelling – were also discussed. Presentations on simulations of electron-cloud build-up and associated beam instabilities included the physics models that form the basis of existing simulation codes, simulation results and comparisons of simulations and observations. Two sessions concentrated on future studies, including plasma physics approaches, and on possible remedies to electron-cloud problems.

Summarizing the workshop, Weiren Chou of Fermilab highlighted the need to strengthen international collaboration on electron-cloud effects. A tangible result of the workshop was the establishment of a few key contact people who have agreed to coordinate future worldwide activities related to laboratory measurements, theoretical approaches and simulation-code comparisons.

D0 physicists ‘shake hands’ across the Atlantic in key grid test

In an important test of datagrid technology, members of the D0 collaboration at Fermilab have successfully communicated across the Atlantic with colleagues in the UK. The aim of the grid is not only to make it possible to access data remotely on different machines, but also to enable data processing to take place on remote machines.

A vital first step in achieving this goal is to allow individuals wishing to access the grid to identify themselves and show that they are authorized users. A two-way trust must be established between the individual and the machine that is being used.

In the tests, carried out in February, Fermilab exchanged files with Lancaster University, UK and Imperial College, London, after the transfers had been authenticated using certificates issued by the Department of Energy ScienceGrid and the UK High Energy Physics Certificate Authority.

The “firewalls” installed in many computer systems are making it increasingly difficult to access computers remotely, which is the antithesis of the philosophy behind the grid. The authentication system is intended to provide a means of allowing secure access so that the grid can operate effectively. In February’s tests the certificates were used to establish trust between users and machines at Fermilab, Imperial College and Lancaster.

Although in this case the users were members of the same collaboration, the transfers took place as though the users were completely unknown to one another. This approach was used to test the Globus Toolkit – the software tool that was used to build the authentication system.

This software is currently being developed by the US-based Globus Project to bring about the higher level of computer access that will be essential if the grid is to fulfil its promise in a wider context.

Smooth surfaces give boost to TESLA superconducting cavity

Accelerating cavities for the proposed TESLA superconducting electron-positron collider routinely achieve the 25 MV/m accelerating gradient required to reach a collision energy of 500 GeV in a 33 km collider. Now, for the first time, a nine-cell TESLA cavity has reached 35 MV/m, which would be sufficient to operate the accelerator at 800 GeV.
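
The link between gradient and collision energy is straightforward arithmetic (a rough estimate; the quoted 33 km site length also includes the fill factor of the cryomodules and other non-accelerating sections). The two linacs together must supply the full collision energy, so the active accelerating length required is approximately

L_{\mathrm{active}} \approx \frac{E_{\mathrm{cm}}}{G}: \qquad \frac{500~\mathrm{GeV}}{25~\mathrm{MV/m}} = 20~\mathrm{km}, \qquad \frac{800~\mathrm{GeV}}{35~\mathrm{MV/m}} \approx 23~\mathrm{km},

which is why a gradient of 35 MV/m would allow 800 GeV collisions within essentially the same footprint.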

The secret of this achievement is a collaboration between Hamburg’s DESY laboratory, which is driving the TESLA project, and the KEK laboratory in Japan. Until now, the niobium surface of the TESLA cavities has been chemically etched. Through the DESY-KEK collaboration, the new cavity has been electropolished by the Nomura Plating Company, resulting in a smoother surface. Moreover, baking the cavities at 100°C has been shown to lead to further improvements, with 40 MV/m being reached in single-cell cavities.

An electropolishing facility for TESLA cavities is scheduled to be commissioned at the DESY laboratory in June.

Fermilab turns up the heat on electron cooling

In February Sergei Nagaitsev and his group at the US Fermi National Accelerator Laboratory (Fermilab) reported a breakthrough. Working on an ambitious electron-cooling project, the team set a new world record for DC beam power – they maintained a continuous 3.5 MeV electron beam with a current of more than 500 mA for up to 8 h with only short interruptions.

These figures may not, at first sight, seem significant. After all, half an amp is the current flowing through a typical light bulb. However, the beam electrons travel at a much higher energy than those in an electric wire, leading to a record beam power of about 2 MW in the short prototype beamline.
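
The 2 MW figure is simply the beam energy multiplied by the current (a back-of-envelope check):

P = U I \approx 3.5~\mathrm{MV} \times 0.5~\mathrm{A} \approx 1.8~\mathrm{MW},

some four orders of magnitude more than the same half-amp dissipates in a household light bulb.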

Nagaitsev’s group aims to use an electron beam to cool antiprotons inside Fermilab’s 3 km Recycler antiproton storage ring and boost the luminosity of the laboratory’s Tevatron collider. When the electron-cooling system is complete, electrons and antiprotons will travel side by side in the Recycler. The electrons will absorb the excess heat of the antiprotons, shrinking the size of the antiproton beam. To be efficient, the electron beam must contain many more particles than the antiproton beam, requiring scientists to develop a high-current electron system.

The cooling process only consumes a fraction of the 2 MW beam power because scientists can recirculate the electrons and their power. The electrons start at the top of an 8 m high Pelletron accelerator – a Van de Graaff-type device developed by the National Electrostatics Corporation (NEC) – where they gain energy by travelling through a 3.5 MV electrostatic accelerating tube. They then pass through a loop and re-enter the Pelletron, where they are decelerated by traversing the electrostatic field in the opposite direction. A beam collector at the top of the Pelletron receives the electrons and supplies them for re-acceleration. Only a few electrons, about 20 in every million, are lost each trip. A 200 mA Pelletron-charging current is sufficient to ensure stable operation of the recirculation system and to restart beam recirculation within 20 seconds if the machine trips off. The Fermilab recirculation system is unique in sustaining such a high current with so little loss at an energy that is significantly more than a few hundred kilo-electron-volts.
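
The quoted loss rate translates into a tiny current that the charging system has to make up (a rough estimate from the numbers above):

I_{\mathrm{loss}} \approx \frac{20}{10^{6}} \times 500~\mathrm{mA} = 10~\mu\mathrm{A},

so the recirculation scheme lets a half-amp beam be sustained while the electrostatic terminal only has to replace microamps of lost charge.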

A versatile machine

NEC, a Wisconsin-based company that received a Small Business Innovation Award in 1984 from the US Department of Energy, has made more than 140 Pelletrons and sold them in 38 countries. The machine gets its name from the chains of metal cylinders – pellets – that replace the belts of conventional Van de Graaff generators.

Pelletrons are used in applications such as surface analysis and doping of computer chips. But the machines are also valuable beyond the field of physics. The new security inspection system for the Channel Tunnel, for example, uses two Pelletrons to produce X-rays for scanning loaded trucks and containers. Pelletrons are also used for carbon dating in accelerator mass spectrometry.

Most Pelletrons operate as non-recirculating accelerators, typically featuring one-way beams of less than 50 mA. In contrast, Fermilab’s electron-cooling project relies on a continuous high-current beam, which can only be achieved through recirculation. “People in this business know how hard it is,” said project leader Nagaitsev. “Everybody is pushing the envelope. People working on related projects in the US and Europe are waiting for our results. Our success or failure means quite a bit at other laboratories.”

More collisions

With the help of electron cooling, Fermilab scientists will create a larger number of collisions inside the Tevatron. “The goal of our R&D project is simple – construct and commission an electron-cooling device that is ready to be moved to the Fermilab Recycler,” said Nagaitsev. A dedicated building to be located next to the Recycler is already being designed to house the electron-cooling equipment.

Nagaitsev’s team is currently working in a building more than a kilometre away from the Recycler. So far, electrons haven’t mingled with a single antiproton: the team is still refining operation of the Pelletron so that it delivers a stable electron beam over long periods. The Fermilab group plans to increase the beam energy to 4.3 MeV and the current to more than 1 A; so far, they have attained 750 mA for short periods.

The next step is to improve the quality of the electron beam as it travels through a special cooling section – initially without the presence of antiprotons. Only when that is achieved will the electron beam be used to cool antiprotons. “Depending on the efficiency of the Recycler,” said Nagaitsev, “maybe we can increase luminosity by a factor of two, maybe more.”

A test beamline with a nine-module cooling section is currently being incorporated into the Pelletron recirculation loop. This will enable the Fermilab team to study the electron beam carefully in the environment of the cooling section, determining the exact beam energy and the size of the high-current beam. Ultimately the electrons must travel parallel to the antiprotons, so the challenge is to put electron and antiproton beams on top of each other to within 50 mm.

In the final phase of the project, anticipated for 2003 or 2004, scientists will install the 20 m cooling section in the Recycler ring and send electrons and antiprotons through the cooling section at the same time. If everything works well, each antiproton will find itself surrounded by a cloud of electrons. Antiprotons going too fast will slow down as they bump against electrons in front of them. Antiprotons going too slow will speed up as electrons kick them from behind. With each collision, the lighter electrons will reduce the spread of energy within the antiproton beam. All of this will happen in a gentle way, since the masses of the particles make the collisions reminiscent of ping-pong balls bouncing off a bowling ball.

Cool idea

Gersh Budker first proposed the idea of electron cooling in 1966. It was first tested in 1974 at the Institute of Nuclear Physics in Novosibirsk, Russia, using proton beams. In 1976, David Cline, Peter McIntyre and Carlo Rubbia proposed using electron cooling for antiproton beams at Fermilab. Due to technical difficulties with cooling hot antiprotons, Fermilab turned to stochastic cooling, an alternative beam cooling technique developed at CERN by Simon van der Meer. The electron-cooling equipment went to the Indiana University Cyclotron Facility, where the equipment is still in use and provides electrons with a maximum energy of 300 keV.

CERN also developed electron-cooling systems, starting with the ICE ring in the late 1970s. CERN’s Low Energy Antiproton Ring (LEAR) used a 30 keV electron-cooling system from 1992 until 1996. Today, low-energy electron-cooling systems are used successfully at many facilities around the world. The Fermilab team is the first to develop the technique for electrons in the MeV range.

Potential applications for a recycled electron beam go beyond the world of particle physics, and the Fermilab result is attracting the attention of Free Electron Laser (FEL) builders around the world. FELs are powerful light sources that have many applications in molecular biology, materials science and chemistry. Rather than throwing away the electrons and their energy, recycling the beam could allow scientists to produce laser light with little electrical power input. Scientists at the University of California Santa Barbara have worked on Pelletron-driven FELs and beam recovery systems since the early 1980s using pulsed electron beams. The group has also looked at low-current continuous beam options, which were a precursor to the Fermilab project.

In particle physics, the future for electron cooling of antiproton beams looks bright. Stochastic cooling is limited, and to decrease beam temperature further, electron cooling is needed. Sergei Nagaitsev’s team has taken a big step in that direction. Some day, cooling antiprotons may be as easy as switching on a fridge. Although there is still a long way to go, it might be time to start chilling some champagne.

New detectors for physics at a new mass scale

Great achievements have been made in recent years using the very large underground detectors initially designed to observe possible proton decays. The pioneers in this field were KOLAR in India, IMB in the US, Kamiokande in Japan, and NUSEX and Fréjus in Europe, the largest of which used an instrumented mass of the order of 1 kiloton. Through the non-observation of proton decay it was possible to reject the simpler versions of grand unification theories (which unify the weak, strong and electromagnetic interactions within a single framework). However, the results harvested went far beyond this – a detailed study of atmospheric neutrinos (produced by cosmic rays in the Earth’s atmosphere), which constituted the main background to the sought-after decays, provided hints of an anomaly in the flavour composition of these neutrinos. These measurements have since been refined by the 50 kiloton Superkamiokande detector, which has unambiguously established the existence of flavour oscillations, in which muon neutrinos transform into tau neutrinos.

In 1987 the Kamiokande and IMB detectors, both water Cerenkov counters with an energy threshold of only a few MeV (made possible by the high level to which the water was purified), observed a small burst of neutrinos coming from supernova 1987A. This observation opened the way for neutrinos to be used as the messengers of the universe, an avenue since pursued by experiments such as AMANDA at the South Pole and ANTARES, which is currently being deployed in the Mediterranean Sea. Thanks to their very low energy thresholds, Kamiokande and Superkamiokande have also been able to measure the neutrino flux from the Sun and confirm that a large deficit exists, which also hints at neutrino oscillation. This interpretation was confirmed in spectacular fashion in June 2001 and reinforced this April by results from the Sudbury Neutrino Observatory’s unique 1 kiloton heavy water Cerenkov detector in northern Ontario, Canada (Direct evidence seen for oscillations), bringing positive evidence that neutrinos have mass.

These advances have led physicists to consider the possibilities offered by detectors not of 1 or 50 kilotons but of 1 megaton. Such devices would be capable of detecting proton decays at the very weak rates predicted by the latest supersymmetric grand unification theories. They would allow the tens of thousands of neutrino interactions produced by a supernova in our galaxy to be observed in a matter of seconds, thus supplying a wealth of information on explosion mechanisms. Such detectors could also act as targets for the neutrino superbeams currently being studied in Japan, the US and Europe.

A workshop on megaton detectors was held at CERN in January to take stock of developments in this field. It brought together 65 participants from Europe, 25 from America and 10 from Asia. The speakers addressed three main themes in detail – proton decay, supernovae and superbeam-assisted neutrino oscillations – from both theoretical and experimental points of view. They described very large Cerenkov detector projects and the underground sites that would potentially house them. Japanese, American and French engineers discussed the problems associated with digging deep underground cavities with a volume of 1 million m³. Suggestions were also made for alternatives to the Cerenkov technique. These included a liquid-argon detector along the lines of the ICARUS detector being prepared for Italy’s Gran Sasso underground laboratory, but with a mass of up to 100 kilotons, and the specialized OMNIS detector, for which a UK-US collaboration proposes a lead target for supernova neutrinos. Fine-grained calorimetric detectors that may be envisaged for high-energy neutrino superbeams were also discussed.

On the experimental side, the workshop brought together two communities of physicists – those interested in neutrinos and those interested in non-accelerator physics – which now require similar types of detector. The convergence of several themes around a single detector capable of addressing them all can only strengthen these ambitious projects, which require worldwide collaborations. The theoretical implications of both proton decay research and a fine study of neutrino mixing are of fundamental importance as they open up windows beyond the Standard Model. These subjects can tell us about the structure of grand unification theories and baryogenesis, topics that must be understood to explain the domination of matter over antimatter in our universe.

As far as proton decay is concerned, in the e⁺π⁰ decay channel a 500 kiloton water Cerenkov detector can provide a sensitivity of 10³⁵ years for the lifetime of the proton, compared with the current limit of the order of 5 × 10³³ years. The K⁺ν̄ decay channel, favoured in certain scenarios, is a more difficult task for water Cerenkov detectors, and for this decay a more ambitious 100 kiloton liquid-argon detector could achieve a similar sensitivity.
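
The order of magnitude of that sensitivity can be checked with a simple counting argument (a rough estimate assuming of order ten years of exposure, good efficiency and essentially no candidate events; the real sensitivity depends on backgrounds and the detection efficiency of each channel). Water contains about 3.3 × 10²³ protons per gram, so

N_{p} \approx 5\times10^{11}~\mathrm{g} \times 3.3\times10^{23}~\mathrm{g^{-1}} \approx 1.7\times10^{35}~\text{protons in 500 kilotons}, \qquad \tau \gtrsim \frac{N_{p}\,T\,\varepsilon}{N_{\mathrm{obs}}} \sim 10^{35}~\text{years}.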

With megaton detectors, the search for supernovae can be spread to neighbouring galaxies (we would observe some 20 neutrinos for an explosion in the Andromeda galaxy), and this has the virtue of increasing the rate of visible explosions to one every 10 years.
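
That figure of some 20 neutrinos is consistent with a simple scaling of the SN 1987A observation (rough numbers: about a dozen events in the roughly 2 kiloton fiducial volume of Kamiokande-II from a supernova at about 50 kpc, scaled to a 1 megaton detector and the roughly 780 kpc distance of Andromeda):

N \propto \frac{M_{\mathrm{det}}}{d^{2}}: \qquad N \sim 11 \times \frac{1000~\mathrm{kt}}{2~\mathrm{kt}} \times \left(\frac{50~\mathrm{kpc}}{780~\mathrm{kpc}}\right)^{2} \approx 20.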

Neutrino superbeams

The most recent subject of study concerning the potential of megaton detectors relates to neutrino superbeams. These are similar to current neutrino beams in their method of production, but will use much more intense primary proton beams. Such proton supermachines are currently being considered for a range of applications including hybrid nuclear reactors and waste reprocessing plants, spallation neutron sources, intense sources of radioactive nuclei, and as the first component of a neutrino factory or a muon collider. These are low-energy machines (at the GeV scale) with about 1 MW of power (compared with several hundred kilowatts at machines such as CERN’s SPS). These machines could supply superbeams of unrivalled intensity for neutrino research.

Today we know that neutrinos have a mass and that they mix, giving rise to the oscillation phenomenon. But there is one oscillation we have yet to observe – that which links muon and electron neutrinos with the frequency observed for atmospheric neutrinos. This oscillation is weak since it is governed by a mixing angle (θ13) that, thanks to the French CHOOZ reactor-based experiment, we know to be lower than 10°. The entire neutrino factory programme hinges on the existence of this tiny oscillation. Neutrino superbeams will therefore be used to demonstrate its existence for θ13 values down to 1°. Such an observation would be reassuring before the construction of a neutrino factory can be launched.
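
In the standard three-flavour framework the channel in question is, to leading order in vacuum, governed by (a textbook expression, not one specific to the workshop presentations)

P(\nu_{\mu} \to \nu_{e}) \simeq \sin^{2}\theta_{23}\,\sin^{2}2\theta_{13}\,\sin^{2}\!\left(\frac{\Delta m^{2}_{31} L}{4E}\right),

so the whole appearance signal is suppressed by sin²2θ13: the CHOOZ bound of θ13 < 10° corresponds to sin²2θ13 of at most about 0.12, and probing values down to θ13 ≈ 1° means looking for appearance probabilities roughly a hundred times smaller still.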

To this end, the Japanese will soon have a 0.8 MW proton accelerator at the Japan Atomic Energy Research Institute, and in 2007 plan to send a neutrino superbeam to the Superkamiokande detector 300 km away. A sensitivity of 2.4° for θ13 is expected. To get the sensitivity down to 1°, Japanese physicists envisage increasing the proton power to 4 MW after 2012 and building a 1 megaton detector, Hyperkamiokande, on the Kamioka site.

In Europe, CERN’s superconducting proton linac (SPL) project aims to supply 2.2 GeV protons at 4 MW by 2012. The optimal baseline for studying the oscillation in question corresponds to the distance between CERN and the Fréjus laboratory. The forthcoming construction of a gallery parallel to the existing road tunnel at Fréjus provides the opportunity to dig a cavern capable of housing a megaton detector comparable to the UNO underground nucleon decay and neutrino observatory being designed in the US. Such a project would provide a sensitivity of 1° on θ13. The SPL would also allow long-lived radioactive nuclei to be produced and stored in storage rings whose straight sections, aimed at the Fréjus underground site, would supply electron-neutrino beams that, in conjunction with the muon-neutrino superbeam, would open up the possibility of studying time-reversal symmetry-breaking in the neutrino sector.

US scientists are looking into using the beam that will be sent from Fermilab to the MINOS detector in 2005 and increasing the proton power from 0.4 to 1.6 MW. A superbeam is also being considered at Brookhaven. It will still be necessary to dig a cavity to house a very large detector, and this could be done at the Homestake mine (which has recently been transferred to the Department of Energy) or elsewhere. The current Fermilab beam will provide a sensitivity of 4° on θ13 using a 5-20 kiloton detector. Future projects could lower this limit to around 1.5° by using either UNO or a 70 kiloton liquid-argon detector.

Looking to the future

Although none of these plans has yet been approved, it seems likely that such a project will be realized. It will form a key component of the global fundamental physics infrastructure of the future, and its location remains to be decided. Europe has a strong track record in the field, but the Japanese project, in its first phase using the Superkamiokande detector, is currently the most advanced. The workshop held at CERN provided the opportunity for a first round of discussions between Europeans, Americans and Japanese with a view to worldwide collaboration around this Japanese first phase. If no oscillation is discovered it will then be necessary to move into the megaton phase, and the siting of such a detector is, for the moment, an entirely open question. Multipurpose megaton detectors will address many open physics issues, and promise, in parallel to CERN’s Large Hadron Collider and the electron-positron linear colliders, to provide exciting results in the decades to come.

Particle physics software aids space and medicine

Simulation programs play a fundamental role in optimizing the design of particle physics experiments. In the development of reconstruction programs, they provide the necessary input in the form of simulated raw data. In the analysis process they are required to understand the systematic effects resulting from detector resolution and acceptance, as well as the influence of background processes. The predecessors of the Geant4 toolkit – which were written in the now almost obsolete Fortran language – were successfully used at CERN for experiments at the laboratory’s Large Electron-Positron collider and for the design of experiments for the Large Hadron Collider (LHC).

Geant4 was launched as an R&D project in 1994 to demonstrate the suitability of object-oriented programming technology for large software projects in particle physics. The initial collaboration of members of particle physics institutes around the world has since been joined by scientists from the European Space Agency (ESA) and members of the medical community.

The Geant4 software toolkit was designed to simulate particle interactions with matter for particle physics. It contains components to model in detail the geometry and materials of complex particle detectors. The simulated particles are propagated through magnetic and electrical fields and through the materials of the detectors. The core of the program contains information on numerous physics processes that govern the interactions of particles across a wide energy range. Visualization tools and a flexible user interface are available as separate components. Rigorous software engineering makes Geant4 open to change in a rapidly evolving software environment, while at the same time ensuring that it can be easily and fully maintained over the lifetime of large-scale experiments.
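
To give a flavour of the component structure described above, here is a deliberately minimal sketch of how a Geant4 application is assembled. The class and method names are those of current public Geant4 releases (the detector, physics list and particle source are invented placeholders, and the interfaces of the 2002-era toolkit differed in detail); it is meant only to illustrate the separation into geometry, physics and primary-particle components, not to serve as a working experiment simulation.

#include "G4RunManager.hh"
#include "G4NistManager.hh"
#include "G4Box.hh"
#include "G4LogicalVolume.hh"
#include "G4PVPlacement.hh"
#include "G4ThreeVector.hh"
#include "G4VUserDetectorConstruction.hh"
#include "G4VUserPrimaryGeneratorAction.hh"
#include "G4ParticleGun.hh"
#include "G4ParticleTable.hh"
#include "G4Event.hh"
#include "G4SystemOfUnits.hh"
#include "QGSP_BERT.hh"

// Geometry and materials: a single block of water standing in for a real detector.
class ToyDetector : public G4VUserDetectorConstruction {
public:
  G4VPhysicalVolume* Construct() override {
    auto water = G4NistManager::Instance()->FindOrBuildMaterial("G4_WATER");
    auto box   = new G4Box("World", 1*m, 1*m, 1*m);           // half-lengths
    auto logic = new G4LogicalVolume(box, water, "World");
    return new G4PVPlacement(nullptr, G4ThreeVector(), logic,
                             "World", nullptr, false, 0);
  }
};

// Primary particles: one 1 GeV electron per event, fired along +z.
class ToyGun : public G4VUserPrimaryGeneratorAction {
public:
  ToyGun() : fGun(1) {
    fGun.SetParticleDefinition(
        G4ParticleTable::GetParticleTable()->FindParticle("e-"));
    fGun.SetParticleEnergy(1*GeV);
    fGun.SetParticleMomentumDirection(G4ThreeVector(0., 0., 1.));
  }
  void GeneratePrimaries(G4Event* event) override {
    fGun.GeneratePrimaryVertex(event);
  }
private:
  G4ParticleGun fGun;
};

int main() {
  auto run = new G4RunManager;
  run->SetUserInitialization(new ToyDetector);  // geometry and materials
  run->SetUserInitialization(new QGSP_BERT);    // electromagnetic and hadronic physics
  run->SetUserAction(new ToyGun);               // primary particle source
  run->Initialize();
  run->BeamOn(100);                             // track 100 events through the setup
  delete run;
  return 0;
}

The same division of labour scales from this toy water block up to the full geometry of an LHC detector or a space telescope: only the user-supplied classes change, while tracking, physics processes and run management are provided by the toolkit.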

Accurate simulations

Geant4 was publicly released in December 1998 and has since been further developed. All Geant4 code and documentation is openly available via the Web. At a recent conference on calorimetry in particle physics at the California Institute of Technology, US, the quality of Geant4’s simulation of the response of electromagnetic and hadronic showers in calorimeters was demonstrated in comparisons of test-beam data with simulation. One of the speakers for the ATLAS experiment (currently in preparation for the LHC) concluded that Geant4 is mature enough as a toolkit, with sufficient physics for electromagnetic showers implemented, to be considered for large-scale detectors.

Other speakers reported on the first results of ongoing comparison projects of hadronic interactions in calorimeters. These first results look very promising. In fact, Geant4 is used in production for the BaBar experiment at the Stanford Linear Accelerator Center, US, and more than 300 million events have been simulated already. This, together with the fact that Geant4 applications are as fast as similar Fortran-based applications, shows that object-oriented technology is capable of standing up to the challenge.

Simulation is equally important in space-based astroparticle physics. Most space probes need to be able to operate for many years without the possibility of physical repair after launch. It is therefore essential to be able to predict the behaviour of all components in the space environment, and in particular to judge the likely effect of radiation on on-board electronics and detectors. The availability of the ISO standard for the exchange of product data (STEP) interface in Geant4 is especially advantageous, as the use of professional computer-aided design tools is commonplace in the aerospace industry.

Geant4 was first used for space applications by ESA in 1999, when ESA and the US National Aeronautics and Space Administration (NASA) each launched an X-ray telescope. Both telescopes follow highly eccentric orbits, reaching at their far point one-third of the distance to the Moon. NASA’s Chandra was launched in July 1999. During the initial phase of operation, some of the front-illuminated charge-coupled devices (CCDs) experienced an unexpected degradation in charge-transfer efficiency. ESA scientists, who had been planning to launch their X-ray multi-mirror (XMM) Newton Observatory in December 1999, needed to understand the possible origin of this problem to protect their detectors from similar damage.

The geometries of both telescopes, including the concentric mirror systems, were described using the Geant4 toolkit. Particles, in particular low-energy protons trapped by the Earth’s magnetosphere in the Van Allen radiation belts, were simulated entering the apertures of the telescopes. The simulation revealed that these particles are scattered at shallow angles from the mirror surfaces and are focused onto the surface of the sensitive CCD detectors, completely bypassing the collimators and other elements that were supposed to shield the devices.

This simulation explained why NASA’s emergency measure to move the detectors out of the focal plane during the passage of the radiation belt prevented any further degradation. With the Geant4 study’s input, the operational procedures of XMM Newton were arranged so that the detectors were powered off during the passage of the radiation belts for about 8 h of the 48 h orbit. Both telescopes now deliver magnificent scientific data.

The dose estimation by simulation of the International Space Station (ISS) radiation environment project (known as DESIRE) aims to use Geant4 to calculate radiation levels inside the Columbus ISS module and to estimate the radiation doses on the astronauts. Apart from assessing the risk involved in space missions from exposure to radiation, Geant4 plays an important role in evaluating the performance of particle detectors. For the ESA BepiColombo mission to Mercury, currently planned for launch in 2009, detectors will analyse the spectrum of fluorescence from planetary material induced by solar flares. Using Geant4, the spectra and expected detector response have been simulated and the optimization of the detector technology in the severe radiation environment close to the Sun is under way.

Medical applications

Geant4’s extended set of physics models, which handle both electromagnetic and hadronic interactions, can be used to address a range of medical applications from conventional photon-beam radiotherapy to brachytherapy (using radioactive sources), hadron therapy and boron neutron capture therapy. The tools for describing geometries, materials and electromagnetic fields can precisely model diverse real-life configurations. An interface to the Digital Imaging and Communications in Medicine (DICOM) standard will soon make it possible to import computer tomograph images directly into a Geant4 geometrical model. The quality-assurance methods applied in Geant4, its open-source distribution and its independent validation by a worldwide user community are particularly important in the medical domain.

Geant4 can play a significant role in estimating the accuracy of radiotherapy treatment planning, exemplified by comparisons of its simulations with commercial software and experimental data. One study exploited Geant4’s accurate simulation of electromagnetic interactions down to very low energies to account precisely for effects resulting from source anisotropy. The same method has also been applied to calculate dose distribution for certain superficial brachytherapy applicators where no other treatment-planning software is available.

Other studies have exploited Geant4’s capability for precision-modelling of geometries, materials and physics processes to provide accurate dose distributions in heterogeneous geometries. High-precision dose evaluation is important because, in some tumour sites, a 5% under-dosage would decrease local tumour-control probability from around 75% to 50%. As with typical physics applications, in which simulation is used to optimize the design of particle detectors, Geant4 has allowed the optimization of brachytherapy seeds, improving the treatment’s effectiveness while sparing surrounding healthy tissue. The suitability of Geant4 has been demonstrated in advanced radiotherapy techniques, such as intra-operatory and intensity-modulated radiotherapy. Several projects also apply Geant4 in the domain of radiodiagnostics. Possible future extensions include modelling the effects of radiation at the biomolecular level.

Geant4 is developed and maintained by an international collaboration of physicists and computer scientists. The open and collaborative relationship between the development team and its user communities has led to a two-way transfer of technology, with users from fields other than particle physics actively contributing. The expertise of the biomedical and space user communities in simulation has resulted in many significant contributions to Geant4 in areas such as testing and validation, as well as extensions of functionality. These developments bring valuable enhancements to Geant4’s applications in particle physics.

Systems engineers find their way to San Jose

The 8th biennial International Conference on Accelerator and Large Experimental Physics Control Systems, ICALEPCS 2001, was held in San Jose in November and hosted by the Stanford Linear Accelerator Center (SLAC). Some 270 control systems specialists took part, with around 120 staying on for three workshops covering databases, experimental physics and industrial control systems (EPICS) and automated beam steering.

The ICALEPCS conferences attract participants from a broad range of subject areas. As well as particle accelerators and detectors, other large facilities such as telescopes, fusion devices and nuclear reactors are also represented. Both hardware and software aspects are covered. The 2001 meeting saw participants coming from Africa, the Americas, Asia and Europe, representing 62 laboratories and seven companies. With such a broad base, focused workshops traditionally follow the conference. Last year, Roland Müller from Berlin’s BESSY synchrotron laboratory organized one for the international accelerator database group, Greg White of SLAC convened another on automated beam steering and shaping, and Bob Dalesio of the Los Alamos National Laboratory organized a third on EPICS.

Widening applications

The broadening range of applications of control systems for large-scale facilities was demonstrated by the large number of first-time contributions from non-accelerator facilities. From the field of nuclear fusion, the laser-driven approach was represented by a talk on the Lawrence Livermore National Laboratory’s National Ignition Facility (NIF) project, while the magnetic confinement approach was represented by Princeton’s National Spherical Torus Experiment (NSTX). Talks on astronomy covered the European Southern Observatory’s Very Large Telescope (VLT) and the VISIR mid-infra-red spectrometer being built for it, the Gemini South telescope in Chile, the Italian Osservatorio Astronomico di Capodimonte, and the Atacama Large Millimetre Array project. Participants from the LIGO gravitational wave observatory discussed the control system for its two interferometers, which are separated geographically, but operated as a single observatory. Traditional ICALEPCS ground was covered by presentations about CERN’s ATLAS and CMS experiments, BaBar at SLAC, D0 and CDF at Fermilab, H1 at DESY and KLOE at Frascati.

The increased level of collaboration across different controls projects was noteworthy. Engineers from facilities such as the Spallation Neutron Source (SNS) being built at the US Oak Ridge National Laboratory, the Swiss Light Source (SLS), the Visible Multi-Object Spectrograph for the ESO’s VLT and the NSTX are establishing a common development environment and sharing packages, modules, designs and experience. Papers on commercial control systems included the Argonne Tandem Linear Accelerator System, which uses the commercial VSystem, the CMS and H1 experiments, both of which have adopted the PVSS II system, and Frascati’s DAFNE accelerator, which uses LabVIEW.

Established technology

Control systems for large facilities are no longer the high-risk endeavours they used to be. Trained and experienced people, tools, equipment and bandwidth are all now widely available. Several status reports at the conference on recently commissioned control systems gave ample confirmation of this. The SLS control system, for example, has proven itself to be highly reliable. It comprises 100,000 data channels with 150 electronics crates running EPICS software, and uses common object request broker architecture (CORBA) to provide a commercial (or freeware) interface and management of connections between software objects. The control system for Brookhaven’s Relativistic Heavy-Ion Collider (RHIC) is another success story. It incorporates commercial hardware with software written in the C++ and TCL languages, and has many automated features such as ramp control, sequencing and tune feedback.

The H1 experiment’s detector control and monitoring control system, based on PVSS II, successfully completed its final test phase with the delivery of a prototype for control of the H1 detector’s high-voltage system, superconducting solenoid and luminosity monitoring system. The BaBar online team reported on the use of Objectivity/DB, an object-oriented database management system. Other reports on running facilities came from NSTX, which achieved first plasma in February 1999, and the D0 experiment, which has solved the problem of mismatch between the Oracle database format and EPICS’s ASCII files.

Several very large systems in the construction phase were also discussed. The $2.5 billion (€ 2.9 billion) NIF project with its 192 laser beams delivering 1.8 MJ is scheduled to be fully operational by 2003. The strategy used to develop its integrated computer control system calls for incremental cycles of construction and testing to deliver a predicted total of 1 million lines of code. There is a development process with clearly defined roles and responsibilities, productivity measures, extensive documentation, regular assessment, and design and code reviews. NIF’s commitment to formal testing with an independent testing team and facility is an exciting development. It requires management commitment and money, but is expected to pay off with the availability of systems that meet real functional requirements and have fewer bugs.

Accelerator science needs more brain power

Galileo’s remark, “measure what is measurable, and make measurable what is not so”, says it all with respect to contemporary quantitative science. He gives pride of place not only to measurement, but also to extending the means of measurement beyond its current circle of light.

While these principles are broadly respected, the latter tends to be somewhat narrowly construed to our detriment. The particular example I have in mind is that of accelerators – although there are others. The range of applications of accelerators to science – and technological applications as well – has grown steadily, particularly with the rapidly expanding use of synchrotron radiation for the life, materials and engineering sciences, to say nothing of the growing use of accelerators for neutron production in these same fields. There are two excellent reasons for the stakeholders in these fields to take a more active role in this part of instrumentation development:

• the control of increasing facility costs needs intellectual input;

• the need for new capabilities in pushing forward the frontiers in the various sciences demands involvement by those who best understand these capabilities.

Accelerator history

Before proceeding, it would be best to review the history of accelerator science and technology. The first half of the 20th century saw rapid development of the various accelerator types we use today. With a few exceptions, these developments were driven by the scientists who needed them for their research, both scientists working in university environments and those in the larger facilities that began to grow after the Second World War. With the scientific need for higher and higher energies enabled by the discovery of the alternating gradient principle and the development of systematic design methods for accelerators, a specialization of labour developed in which accelerator science became an identifiable speciality diverging from nuclear and particle science. This, coupled with the closing of most university accelerators, forced by the need for ever larger facilities, has effectively removed intellectual involvement in accelerator development from most university campuses (with a few notable exceptions). To continue with the example of particle physics, today only 13% of experimental particle physicists in North America claim involvement in accelerator work, and two-thirds of them reside at national laboratories. By contrast, three-quarters of experimental particle scientists reside at universities. The health of accelerator-based science depends on redressing this imbalance in intellectual centres of gravity.

While the laboratory structure that has developed, driven by these trends, has been hugely successful, the need for reconsideration is apparent. Progress at the energy frontier of particle science is now strongly compromised by the cost of the required facilities. Cost is also a factor in facilities for radiation production for the life, materials and engineering sciences – though not yet as urgent as for particle science. However, technical advances in improving brightness, coherence and time structure of radiation-producing accelerators are needed to continue advancing on the important frontiers. These advances need the intellectual input of those who know exactly what characteristics are needed, and who are capable of matching technical possibilities to these needs.

One often hears that the culture changes implied by these observations cannot take place because the scientists concerned are not themselves accelerator specialists – how can one contribute to such a mature and well developed field dominated by experts? The point is that the problems to be solved and the concepts to be developed have significant components outside of the traditional accelerator science and technology purview – just the sort of instrument-developing activity that good experimental scientists have always engaged in. Of course the specialists are needed, but new ideas “outside the box” are required. It should be obvious that at this stage of world science, more intellectual input into this part of instrument development is needed. It’s a matter of perspective. University and lab scientists, and their cultural underpinnings, need to see themselves in this picture if we are to continue the progress that can be afforded by the use of accelerators.

Superkamiokande to be rebuilt this year

Following an accident last November at the Superkamiokande detector in Japan, the experiment’s spokesman, Yoji Totsuka, vowed that the detector would be rebuilt. Investigations carried out since then have shown a way to prevent such accidents happening again, and rebuilding is now under way.

Superkamiokande is a huge water Cherenkov detector in which some 11,200 photomultiplier tubes view 50,000 t of pure water 1000 m underground. It seems that shockwave propagation from a single tube imploding could have sparked off a chain reaction that destroyed the detector. The short-term solution is to encase the tubes in 13 mm acrylic plus fibre-reinforced plastic bubbles, which would contain the implosion. The recovery plan is to deploy some 47% of the full complement of tubes before the end of the year, allowing operation to resume with the K2K neutrino beam, sent from the KEK high-energy physics laboratory, early next year. Tubes will be deployed in such a way as to maximize the effectiveness of the detector for observing the K2K beam.

For the longer term, the Superkamiokande collaboration will be carrying out research and development into photomultiplier technology, studying aspects of glass shape and structure. The detector is scheduled to be fully rebuilt by 2007, in time for commissioning of a neutrino beam to be sent from the new Japan Hadron Facility.

ICFA sets up international committee for linear collider project

At a meeting held in February at the Stanford Linear Accelerator Center, the International Committee for Future Accelerators (ICFA) announced that an international steering committee would be set up to promote a 500 GeV linear collider. This move reflects a growing consensus in the global high-energy physics community that such a machine should be the next major facility to follow CERN’s Large Hadron Collider. Reports published by the Asian and European Committees for Future Accelerators and the US High-Energy Physics Advisory Panel all recommended a 500 GeV electron-positron linear accelerator, designed, built and operated as a fully international collaboration. The committee will be made up of members from separate Asian, European and North American steering groups, with more members from other countries. Its first meeting will be held during the 31st International Conference on High Energy Physics in Amsterdam this July.
