Making Tevatron magnets

A new magnetic system for Fermilab’s Tevatron collider has been designed and built in a collaboration between Fermilab and the Institute for High-Energy Physics (IHEP) in Protvino, near Moscow. The Tevatron Electron Lens (TEL), to be installed in the proton-antiproton collider ring, produces a solenoidal field that focuses an electron beam; the electron beam in turn acts on the antiproton beam, compensating beam-beam effects.

The TEL magnetic system was fabricated at IHEP, and tested at Fermilab last autumn. It consists of seven superconducting magnets (a solenoid and six steering dipoles) and two conventional solenoid magnets equipped with correction coils.

The electron beam from an electron-gun cathode is transported through the interaction region in the strong field of the superconducting solenoid and absorbed in a collector. The main TEL element is the superconducting solenoid, which is 2.5 m long and produces a 6.5 T field. The solenoid coil (inner radius 76 mm) is wound from a flat transposed cable of 10 wires (niobium-titanium filaments in a copper matrix), each 0.85 mm in diameter.

With 7289 turns, the superconducting coil has an inductance of 0.6 H (operating current 1.8 kA, stored energy 950 kJ). Studies of the quench process have shown that the coil is not self-protecting against a resistive transition, so protective measures must be taken: fast quench detection and extraction of the stored energy into an external dump resistor.
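
The quoted numbers are self-consistent: the stored energy follows from E = ½LI². A minimal check in code, using the figures above (the exact operating current is an assumption, taken just under the nominal 1.8 kA):

```python
# Stored magnetic energy of a solenoid: E = 0.5 * L * I**2.
# Figures from the article; the precise operating current is assumed.
L_coil = 0.6      # coil inductance, henries
I_op = 1.78e3     # operating current, amperes (~1.8 kA)

E_stored = 0.5 * L_coil * I_op**2
print(f"stored energy: {E_stored / 1e3:.0f} kJ")  # ~951 kJ, matching the ~950 kJ quoted
```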

In the first high-current test of the superconducting solenoid, 5.64 T was reached at a current ramp rate of 3 A/s. A second quench took place at 6.6 T at the same ramp rate, after which the solenoid could not be quenched up to 6.7 T at 10, 20 and 30 A/s (the maximum allowed by the power supply). The magnet quenches very quietly and consumes little helium during a quench.

Two field-measuring systems record the magnetic field distributions. The first uses a three-dimensional Hall probe, which records all three field components at each position. The second tracks field lines with a magnetic rod: a small trolley carries a freely rotating magnetized rod fitted with a mirror, and as the trolley moves through the solenoid, small deviations in the local field rotate the mirror and deflect a reflected laser beam.
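
To illustrate what the Hall-probe data give directly, the sketch below computes the field magnitude and its tilt relative to the solenoid axis from the three measured components; the readings are invented for illustration and are not TEL data:

```python
import numpy as np

# Hypothetical Hall-probe readings (tesla): one (Bx, By, Bz) triple per
# position along the solenoid, with z the axial direction.
readings = np.array([
    [0.012, -0.008, 6.498],
    [0.009, -0.005, 6.501],
    [0.011, -0.007, 6.500],
])

magnitude = np.linalg.norm(readings, axis=1)              # |B| at each point
transverse = np.linalg.norm(readings[:, :2], axis=1)      # off-axis component
tilt_mrad = 1e3 * np.arctan2(transverse, readings[:, 2])  # angle to the axis

for m, t in zip(magnitude, tilt_mrad):
    print(f"|B| = {m:.4f} T, tilt = {t:.2f} mrad")
```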

Six superconducting steering dipoles are mounted on the outer surface of the superconducting solenoid coil; they correct the electron-beam trajectory inside the magnetic system. The superconducting solenoid and the steering dipoles are enclosed in a yoke of low-carbon steel, which improves the magnetic-field homogeneity inside the solenoid aperture. All of the superconducting coils, together with the steel yoke, are enclosed in a helium vessel, and the TEL cryostat forms part of the Tevatron magnet-string cooling system.

Finnish technology takes on CERN’s data mountain

In the early 1990s CERN was confronted with a big problem – how to manage the estimated 2.5 million documents needed to build its proposed new accelerator, the Large Hadron Collider (LHC). Fortunately a solution was at hand in the form of a novel distributed information system developed at the laboratory by Tim Berners-Lee and colleagues – the World Wide Web.

The Web, in combination with an initiative set up at the Helsinki University of Technology (HUT), has led to the successful transfer of technology and know-how from CERN to the young Helsinki-based company Single Source Oy.

When the LHC project got under way, HUT’s Institute of Particle Physics Technology surveyed competencies available in Finland to identify areas where the country could best contribute. Among their finds was a group at the university’s Institute of Industrial Automation that was studying the development of business processes in large international companies.

LHC testbed

The LHC, one of the largest international projects ever undertaken, provided an ideal testbed for the group’s nascent ideas, so project director Ari-Pekka Hameri, together with many of his staff, relocated to CERN. In 1996 they launched TuoviWDM (the Tuovi Web Data Management project). Tuovi is a Finnish girl’s name, taken from the Finnish acronym for product process visualization.

The TuoviWDM project provided the Web interface to CERN’s commercially supplied Engineering Data Management System, in which all LHC-related documents reside. The project also interfaced naturally with CoDisCo (the Connecting Distributed Competencies project), run by a consortium of Nordic industrial companies funded by the Nordisk Industrifond. CoDisCo used CERN as a case study for distributed project management practice, with the intention of transferring CERN’s Web experience across to industry.

Over the years the number of Finnish engineers and students passing through CERN to work on TuoviWDM steadily increased as the project evolved. Take-up at CERN was slow at first, but, when it became apparent that several underlying data management packages were being used – the LHC experiments, for example, do not use the same packages as the accelerator teams – the need for a single platform-independent interface became clear and TuoviWDM fitted the bill. The next question to be asked was how to ensure long-term support for a system that had been designed and built by a small in-house team.

The solution came at the end of 1996 in the form of an agreement between CERN and the Helsinki Institute of Physics (HIP), which has responsibility for Finland’s relationship with CERN. Under this agreement, HIP would finance future software development while CERN would continue to provide the necessary infrastructure and support. CERN was also granted an irrevocable, non-exclusive and permanent licence to use TuoviWDM free of charge. “The agreement gives CERN extensive benefits,” explained Dr Hameri, “in return for a modest contribution in terms of infrastructure support and a testbed for the technology.” However, the agreement left the question of long-term support open. Moreover, CERN was not the only body needing such support – companies involved in a TuoviWDM pilot project were also asking for the product to be put on a more solid footing, and so the idea of launching a commercial company was hatched.

At first, TuoviWDM provided a Web-based interface to all documentation related to a particular project. By 1998 this had been deployed in many particle physics research centres around Europe and was being used by about 12 000 people. It was also in 1998 that some of the original HUT people who had worked on the project at CERN started up Single Source Oy to support the software.

Meanwhile, development was still under way at CERN, and the fledgling firm worked hand in hand with the lab to add features that would be invaluable to the LHC project and marketable by the company wherever large teams of people had to be managed. It was during this period that TuoviWDM evolved into the commercial product Kronodoc, which not only manages documents – keeping track of authorship and cataloguing modifications – but also provides a powerful management tool by tracking the use of documents.

Kronodoc allows project managers to see who is accessing documents and how they are using them. It distinguishes between viewing and downloading, which roughly equates to the difference between using a document and working on it. The software also builds self-organizing maps that show, at a glance, groups of closely collaborating individuals, as well as isolated groups that have little or no contact. In any large project it is natural for working partnerships to evolve, and for some groups to work closely together at one point in the project’s life and not at another. Engineers, for example, may work more closely with draughtsmen at the beginning of a project than they do as the project evolves. By revealing these working relationships, Kronodoc allows project managers to take the pulse of the project at any moment and then to make sure that all of the necessary working relationships have been put in place.
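
Kronodoc’s internals are not described here, but the bookkeeping involved (classifying access events and linking people through the documents they work on) can be sketched in a few lines; the event format and all names below are entirely hypothetical:

```python
from collections import defaultdict
from itertools import combinations

# Hypothetical access log: (user, document, action), action "view" or "download".
events = [
    ("alice", "lhc-dipole-spec", "download"),
    ("bob",   "lhc-dipole-spec", "view"),
    ("bob",   "cryostat-drawing", "download"),
    ("carol", "cryostat-drawing", "download"),
]

# Viewing roughly equates to using a document; downloading to working on it.
workers = defaultdict(set)   # document -> users who downloaded it
viewers = defaultdict(set)   # document -> users who only viewed it
for user, doc, action in events:
    (workers if action == "download" else viewers)[doc].add(user)

# Link users who work on the same documents: the raw material for the
# collaboration maps described above.
links = defaultdict(int)
for doc, users in workers.items():
    for a, b in combinations(sorted(users), 2):
        links[(a, b)] += 1

print(dict(links))   # {('bob', 'carol'): 1}
```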

Today, Single Source Oy is a successful company whose customers include the Wärtsilä corporation, a leading manufacturer of diesel power plants and marine diesel engines. In the view of Ari-Pekka Hameri, who is still at CERN, this success would not have been possible without the close collaboration between CERN, the Finnish institutions and industry. Over the lifetime of the project, some 38 people funded from Finland worked at CERN, collaborating closely with the laboratory’s personnel and making full use of their expertise. TuoviWDM produced 16 master’s theses and contributed to two doctorates, as well as training 18 students on summer placement programmes. These figures alone represent a significant transfer of technology through people, given that 80% of these students have so far found jobs in industry. According to Dr Hameri, “This flexible exchange of students and researchers, which could be coordinated to the changing needs of the development work, is a unique and highly positive feature of research institutes like CERN.”

Turning inventions into companies

In Finland an invention is the property of its inventor, not of the institution where s/he works. Moreover, the country encourages institutions to support inventors who wish to turn their ideas into companies. “The recent success of Finnish high-technology industry is at least partly due to this type of supportive environment,” said Dr Hameri, who intended to apply a similar approach to TuoviWDM. CERN’s technology-transfer policy, while not identical to Finland’s, allowed him to do so. CERN holds the intellectual property rights to the inventions of its personnel, but the lab’s policy is to publish all of its results, making them available to industry. This allowed members of the TuoviWDM team to take the ideas that they had developed at CERN and seek venture capital to establish a company.

With agreements between CERN, HIP and Single Source Oy guaranteeing the transfer of technology to the new company, Single Source Oy secured the funding that it needed in 2000 and the company now employs some 21 people, 14 of whom have worked on TuoviWDM at CERN. For its part, CERN has the long-term support that it needs, and one of its member states has a tangible return on its investment in basic science.

End-cap toroids for ATLAS experiment are ready to roll

The first part of the end-cap toroid system for the mighty ATLAS experiment, which is under preparation for CERN’s Large Hadron Collider (LHC), will soon be on the road to CERN. The experiment’s two end-cap toroids, each weighing 340 tonnes, will be installed at the outer ends of the ATLAS detector and used to determine precisely the momenta of highly energetic muons emerging from the LHC’s proton-proton collisions.

The main components of the end-cap toroids are being built by Dutch companies under the guidance of NIKHEF in the Netherlands and the Rutherford Appleton Laboratory in the UK, which has design responsibility for the complete end-cap toroid system. Their construction forms an in-kind contribution to ATLAS from NIKHEF, amounting to roughly €5 million of the €8 million manufacturing cost. Smaller contributions to the toroids will come from several other countries in the ATLAS collaboration.

Each of the toroids consists of eight superconducting coils inside an insulating vacuum vessel that is 10.7 m in diameter and 5 m wide. The resulting magnetic field has circular field lines perpendicular to the beams and will deflect the muons in a plane defined by their track and the beam line, allowing much more precise momentum determination than with the inner detector alone.
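
The gain in precision comes from magnetic bending: for a singly charged particle, momentum, field and bending radius are related by p [GeV/c] ≈ 0.3 B [T] R [m]. The sketch below is a back-of-the-envelope illustration, not the ATLAS reconstruction, and the uniform 1 T field is a stand-in for the toroids’ non-uniform field:

```python
# Relation between momentum, field and bending radius for a singly charged
# particle: p [GeV/c] ~ 0.3 * B [T] * R [m]. Illustrative numbers only.

def bending_radius_m(p_gev: float, b_tesla: float) -> float:
    """Radius of curvature of a singly charged track in a uniform field."""
    return p_gev / (0.3 * b_tesla)

for p in (10.0, 100.0, 1000.0):       # muon momenta, GeV/c
    r = bending_radius_m(p, 1.0)      # assume a 1 T stand-in field
    print(f"p = {p:6.0f} GeV/c -> R = {r:7.1f} m")
```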

The first part to be shipped to CERN will be a vacuum vessel, which is scheduled to leave the Netherlands in June. Owing to its size and weight (about 80 tonnes), the move will not be trivial. Split into two halves, the vessel will be shipped from the Netherlands to Strasbourg via the Rhine, and on to Geneva by road.

Manufactured by Schelde-Exotech in Vlissingen, the vessel was machined to a precision better than 1 mm at Machine Fabriek Amersfoort in Ijsselstein (near Utrecht) before being taken back to Vlissingen, where the two halves are now being combined. Once the vessel has been assembled, its mechanical construction and vacuum tightness will be tested, after which it will be split again for transport to CERN.

In the meantime another firm, HMA Power Systems in Ridderkerk, has started production of the superconducting coils and their support structures. The eight coils in each vessel will be tightly fixed between eight aluminium keystone boxes, which will keep the coils in place when the field is switched on, exerting radial forces of up to 550 tonnes per coil. Thermal insulation will be provided by the vacuum, a radiation heat shield made by Hatehof in Israel and multilayer superinsulation blankets made by Austrian Aerospace.

To complete the cosmopolitan whole, the 26 km of conductor for the toroids is being supplied as in-kind contributions from Italy (Europa Metalli), Switzerland (ETH/Nexans) and Germany (Vacuum Schmelze). All components of the end-cap toroids are scheduled to be at CERN by the end of 2002 for integration, surface testing and installation underground in the ATLAS experimental hall.

NIKOS makes coronary tests comfortable

A new method for the radiography of coronary blood vessels (known as angiography) promises to make examinations much easier for patients. The NIKOS intravenous angiography technique, which produces an X-ray image of coronary arteries, was developed by the DESY laboratory, Hamburg, in collaboration with doctors from the University Hospital Hamburg-Eppendorf and the Bevensen Heart Centre, and physicists from the University of Siegen. DESY has examined a total of 379 patients from all over Germany and abroad with extremely satisfactory results.

With the successful conclusion of these trials at DESY’s HASYLAB synchrotron radiation centre, a door opens for routine application of the new technique. However, this would have to be at a specially equipped clinic, with a compact source of monochromatic X-rays. An initial design for such a source, based on a storage ring, has already been made at DESY.

The coronary arteries surround the heart and supply it with blood. If they become constricted, a heart attack can result. To look for these life-threatening constrictions (stenoses), doctors normally insert a long catheter into the coronary vessels via the groin and the aorta. They then inject a contrast medium containing iodine through the catheter and make an X-ray. Such invasive examinations can be an unpleasant experience for patients.

The NIKOS technique eliminates the need for surgical procedures. Instead, the iodine is injected intravenously. Greatly diluted on its journey through the circulatory system, the iodine concentration is so low by the time it reaches the coronary arteries that conventional X-ray tubes cannot produce a clear image. The HASYLAB scientists use intense monochromatic X-rays from the DORIS electron ring as well as a special “two-colour” method to reveal the coronary arteries.
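
A “two-colour” measurement with iodine contrast is conventionally read as K-edge subtraction: the region is imaged at photon energies just below and just above the iodine K-edge near 33 keV, and the logarithmic difference of the two transmission images cancels the tissue contribution, leaving the iodine signal. A schematic sketch, with all coefficients and path lengths invented for illustration:

```python
import math

# Schematic K-edge subtraction. Iodine's attenuation jumps across its K-edge
# (~33 keV) while soft tissue's barely changes, so log-subtracting the two
# monochromatic transmission images isolates the iodine. Numbers illustrative.
mu_tissue = 0.35                               # tissue attenuation, 1/cm (both energies)
mu_iodine_below, mu_iodine_above = 7.0, 36.0   # iodine attenuation jumps at the edge
t_tissue, t_iodine = 10.0, 0.005               # path lengths, cm (body vs dilute iodine)

below = math.exp(-(mu_tissue * t_tissue + mu_iodine_below * t_iodine))
above = math.exp(-(mu_tissue * t_tissue + mu_iodine_above * t_iodine))

signal = math.log(below) - math.log(above)     # tissue term cancels exactly
print(f"iodine signal: {signal:.4f}")          # (36 - 7) * 0.005 = 0.145
```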

Of the 379 patients examined at DESY using the new technique, 60 underwent a subsequent diagnosis based on conventional X-ray exposure. The diagnoses displayed good agreement.

There are other non-invasive and minimally invasive procedures for imaging the coronary vessels – magnetic resonance imaging (MRI) and electron-beam computed tomography (EBCT). Compared with these methods, however, the NIKOS technique is claimed to provide the best image quality. Its resolution is also better and, unlike MRI, it is not degraded by metallic implants. Nevertheless, none of these methods is likely to replace conventional coronary angiography in the long term, because the conventional method also allows for surgery during the examination, such as repair (angioplasty) or tube implantation (stent).

NIKOS allows the imaging of bypasses and stents in check-ups and postoperative examinations. Further improvements to the system and its associated techniques could increase image quality and therefore the value of the diagnosis.

PCs gain greater importance in particle accelerator control

Personal computers are steadily making inroads into some specialist and very impersonal fields. One example is particle accelerators, and the impact of PCs was described at the Third International Workshop on PCs and Particle Accelerator Controls (PCaPAC), held recently at DESY.

From its inception in 1996, PCaPAC has specifically targeted the use of PCs in accelerator controls and has shown itself to be a valuable workshop in giving participants a chance to exchange ideas and experience in PC-related technologies, where trends can change rapidly. Participation in PCaPAC 2000 reached an all-time high of 93 contributions and 127 registered attendees from 43 different institutes and 17 countries.

At PCaPAC 2000, many running accelerators whose control-system infrastructure is based either entirely or in part on PCs were presented. Among these were small systems built and maintained by a few people (e.g. the storage rings ASTRID and ELISA of the ISA Storage Ring Facilities at the University of Aarhus in Denmark), medium-scale systems (e.g. the ANKA synchrotron light source in Karlsruhe, and accelerators at KEK in Japan) and what is probably the largest PC-based control system, serving the HERA, PETRA and DORIS storage rings and their injectors at DESY in Hamburg.

Industrial solutions were also presented, covering complete packages, such as Supervisory Control and Data Acquisition (SCADA) systems, as well as control systems based on the Common Object Request Broker Architecture (CORBA) or the Distributed Component Object Model (DCOM). In this vein, a joint venture between KEK and the IT industry was presented, in which a new Component Oriented Accelerator Control Kernel (COACK) was demonstrated.

In several cases, strategies to convert from legacy systems to modern ones and/or to integrate different platforms were presented. The distributed nature of PC control systems is manifest in the important role that is played by system administration. Also discussed were the needs and wishes of the accelerator operators regarding the control system as well as different approaches to supplying the optimal console profile to different and roaming users.

There were three special “tutorials”. First, a representative from Cisco described networking trends for campus networks over the next three years. Another covered “SCADA – current state and perspective”, giving participants a real feel for both SCADA systems and trends in the field. Finally, since interest in such modern innovations as Java and CORBA remains high, while the number and variety of associated buzzwords make these subjects daunting for the uninitiated, a tutorial on these topics was also included.

While there is significant overlap in topics with the much larger ICALEPCS conference (see Computer control of physics is increasing), PCaPAC has nonetheless found its niche as a biennial workshop, alternating with ICALEPCS. Computer hardware, software and the Internet all evolve quickly, so an event such as PCaPAC, where topics, trends and problems can be discussed in a workshop atmosphere, has proved not only worthwhile but enthusiastically accepted by the controls community. For instance, in the category of Future Trends and Technologies, participants saw their first glimpse of data exchange via SOAP (the Simple Object Access Protocol) and XML (the Extensible Markup Language).
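
To give a flavour of that SOAP/XML data exchange, the sketch below assembles a minimal SOAP 1.1-style envelope; the payload element names are invented for illustration:

```python
import xml.etree.ElementTree as ET

# Build a minimal SOAP-style envelope: structured data wrapped in XML so
# that any platform can parse it. Payload names are hypothetical.
SOAP_NS = "http://schemas.xmlsoap.org/soap/envelope/"

envelope = ET.Element(f"{{{SOAP_NS}}}Envelope")
body = ET.SubElement(envelope, f"{{{SOAP_NS}}}Body")
reading = ET.SubElement(body, "MagnetCurrentReading")  # hypothetical payload
ET.SubElement(reading, "magnet").text = "QD17"
ET.SubElement(reading, "current_amps").text = "312.4"

print(ET.tostring(envelope, encoding="unicode"))
```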

The Fourth International Workshop on PCs and Particle Accelerator Controls will be held in the autumn of 2002, in Asia or Italy.

Computing technology sits in the driving seat

Fermilab director Mike Witherell, welcoming nearly 200 participants from around the world to the 7th International Workshop on Advanced Computing and Analysis Techniques in Physics Research (ACAT 2000), said: “We have wonderful opportunities awaiting particle physics over the next decade. Two technologies are widely recognized as having driven our field from the beginning – accelerators and particle detectors. But there is also growing recognition that we rely on developments in advanced computing technologies. Innovative scientists often recognize the need for a revolutionary development before the wider world understands what it is good for. Once something becomes available, of course, lots of people know what to do with it, as we have learned over the last decade with the World Wide Web. There is a mutual benefit in collaboration between forefront physics research and computing technology. We rely, all over our laboratory (and our community), on continued innovations in the areas being discussed here at this conference.”

Short history

Reflecting the short history of these techniques, the first workshop in the series was held only 11 years ago in Lyon, France, under the name Artificial Intelligence in High Energy and Nuclear Physics, and was organized by Denis Perret-Gallix (LAPP, Annecy). Following this, the workshop was held in Europe every 18 months or so.

The 7th international workshop was the first to be held in the US with the updated name and with expanded scope. It followed four main tracks: artificial intelligence (neural networks and other multivariate analysis methods); innovative software algorithms and tools; symbolic problem solving; and very large-scale computing. It also covered applications in high-energy physics, astrophysics, accelerator physics and nuclear physics.

Besides the plenary, parallel and poster sessions, the workshop included working group and panel discussion sessions focusing on particular topics – uses of C++, large-scale simulations, advanced analysis environments and global computing – which allowed informal presentations and extensive discussion sessions.

The keynote talk, entitled “Information technology: transforming our society and our lives”, was given by Ruzena Bajcsy of the US National Science Foundation. John Moody, a former particle theorist and now professor of computer science and director of the Computer Finance Program at the Oregon Graduate Institute, spoke on “Knowledge discovery through machine learning”. Gaston Gonnet, of the Institute for Scientific Computation at ETH Zurich, Switzerland, talked about computer algebra systems.

A big attraction early in the workshop was C++ inventor and world-renowned computer scientist Bjarne Stroustrup from AT&T Bell Labs. He gave a featured talk entitled “Speaking C++ as a native” and served as a distinguished panellist in discussions on the “Use of C++ in scientific computing”. His talk explained, by way of several simple but striking examples, how C++ can be used in a much more expressive manner than one commonly finds. Stroustrup, echoing the comments of Mike Witherell, noted that the world is slow to catch on to new ideas. He also emphasized the need for physicists to be involved in the C++ Standards Committees if they wish to influence the further development of that language.

Another distinguished participant and speaker was Stephen Wolfram, creator of the Mathematica software packages and winner of a MacArthur Foundation Fellowship in 1981 at the age of 22. Early in his career he worked in high-energy physics, cosmology and quantum field theory. For the last couple of decades he has been developing a general theory of complexity.

Wolfram gave a special colloquium describing his perspective on the development of Mathematica and the establishment of Wolfram Research. His talk gave glimpses of his work on “A new kind of science,” which has occupied his attention during the past nine years. Stephen Wolfram has been working on cellular automata and the evolution of complex systems, and he is writing an epic volume (of about 1000 pages) on the subject, which is soon to be published.

New experiments

High-energy physics experiments and analyses took centre stage halfway through the workshop with plenary talks on “Advanced analysis techniques in HEP” by Pushpa Bhat (Fermilab), “Statistical techniques in HEP” by Louis Lyons (Oxford) and “The H1 neural network trigger project” by Chris Kiesling (MPI). These were followed by “Theoretical computations in electroweak theory” by Georg Weiglein (CERN).

There were vigorous and stimulating discussions in a panel session on Advanced Analysis Environments, with perspectives presented by Rene Brun (CERN), Tony Johnson (SLAC) and Lassi Tuura (CERN and Northeastern).

Fermilab is entering collider Run II (which began in March) with upgraded CDF and D0 detectors. The advanced computing and analysis techniques discussed at this workshop may be crucial for making major discoveries at the Tevatron experiments.

The new generation of experiments under construction in particle physics, cosmology and astrophysics – CMS and ATLAS at CERN’s LHC collider, the Laser Interferometer Gravitational-Wave Observatory (LIGO) and the Sloan Digital Sky Survey (SDSS) – will usher in the most comprehensive programme of study ever attempted of the four fundamental forces of nature and the structure of the universe.

The LHC experiments will probe the tera-electronvolt frontier of particle energies to search for new phenomena and improve our understanding of the nature of mass. LIGO hopes eventually to detect and analyse gravitational waves arising from some of nature’s most energetic events. SDSS will survey a large fraction of the sky and provide the most comprehensive catalogue of astronomical data ever recorded.

Together, these investigations will involve thousands of scientists from around the world. Mining the scientific treasures from these experiments, over national and intercontinental distances, over the next decade or two, will present new problems in data access, processing and distribution, and remote collaboration on a scale never before encountered in the history of science.

Thus “grid computing” is emerging as a key component of the infrastructure that will connect multiple regional and national computational centres, creating a universal source of pervasive and dependable computing power, and it was the focus of a whole day at the workshop. Champions of the grid projects GriPhyN, the Particle Physics Data Grid (PPDG) and the European DataGrid contributed, including Ian Foster (ANL), Paul Avery (Florida), Harvey Newman (Caltech), Miron Livny (Wisconsin), Luciano Barone (INFN) and Fabrizio Gagliardi (CERN), along with other pioneers of grid and worldwide computing.

In the sphere of very-large-scale computing and simulations, Robert Ryne (Los Alamos) spoke on accelerator physics, Alex Szalay (Johns Hopkins) on astrophysics, Paul Mackenzie (Fermilab) on lattice calculations and Aiichi Nakano (LSU) on molecular dynamics simulations. A working group on large-scale simulations coordinated by Rajendran Raja (Fermilab) and Rob Rosner (Chicago) featured contributions from the particle experiments CDF, D0, CMS and ATLAS, as well as from the muon collider and astrophysics communities.

Technology show

A major event at the workshop was a technology show coordinated by SGI representative Kathy Lawlor, Cisco representative Denis Carroll and Fermilab’s Ruth Pordes, Dane Skow and Betsy Schermerhorn.

The show featured the Reality Center for collaborative visualization, IP streaming video, IP Telephony, wireless LAN by SGI and Cisco, and hardware and application software exhibits from Wolfram Research, Platform Computing, Objectivity, Kuck & Associates Inc and Waterloo Maple.

The meeting was organized and co-chaired by Pushpalatha Bhat of Fermilab, who for more than a decade has been a strong advocate of the use of advanced multivariate analysis methods in high-energy physics, and by Matthias Kasemann, head of Fermilab’s Computing Division.

The workshop was sponsored by Fermilab, the US Department of Energy and the US National Science Foundation; it was co-sponsored by Silicon Graphics and Cisco Systems; and it was endorsed by the American and European Physical Societies.

Physics aids new medical techniques

These days, innovation is flourishing in every industry. The variety of ideas currently being generated was illustrated last year by the news magazine Time, when it selected three specialist areas – consumer technology, medical science, and basic industry – in which to put new developments to the vote as “Inventions of the Year”.

The award-winning invention in the medical science category was a scanner that combined the advantages of computed tomography with positron emission tomography (PET). These techniques, which depend on detecting and analysing electromagnetic radiation (X-rays and gamma rays respectively), show that detection techniques from particle physics have made, and continue to make, essential contributions to medical science.

Tracers and tomography

Soon after their discovery by Roentgen in 1895, X-rays were being used for monitoring bones, teeth and other dense organic matter, thereby revolutionizing medical diagnostics and introducing a new science – radiography.

Then came nuclear medicine. George de Hevesy was awarded the 1943 Nobel Prize for Chemistry for his invention of radioactive tracers, in which small doses of radioactive material are administered to patients to follow the metabolic functioning of organs such as the kidneys or thyroid gland. The impact of the technique was so great that the supply of suitable radioactive isotopes went on to become an industry in its own right.

Although they provided valuable new information, these techniques, like conventional X-ray photographs, could only reveal a two-dimensional image of a three-dimensional body, and interpretation could therefore be difficult.

The imaging capabilities of X-rays were dramatically boosted by the 1972 invention of the computer-assisted tomography (CT) scanner, in which a fan-like beam of X-rays rotates round a patient, providing a two-dimensional picture of a “slice” of their body. A complete three-dimensional image can be built up by scanning the body slice by slice. Tomography can be combined with nuclear medicine, for example in single-photon emission computed tomography (SPECT), which maps the internal distribution of the tracer.

The birth of PET

Many artificial isotopes emit positrons, the antiparticles of electrons. In the early 1950s it was discovered that these isotopes offered new possibilities for nuclear medicine.

A positron, once it has been produced, is quickly snapped up by a neighbouring matter particle – usually an electron. This annihilation of the positron and the electron produces a characteristic fingerprint – two 511 keV photons (gamma rays) shooting out in exactly opposite directions. By picking up these pairs it is possible to pinpoint where the positron annihilations occurred. The new science of PET was born, in which the annihilation signals track a patient’s metabolism, revealing, for example, the way in which the brain reacts to stimuli.
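
The reconstruction principle can be sketched directly: each coincident photon pair defines a line of response through the annihilation point, and backprojecting many such lines onto a grid makes the counts pile up at the source. The geometry below is illustrative only, not a model of a real scanner:

```python
import numpy as np

# Toy PET backprojection: every annihilation emits two back-to-back photons,
# so each coincidence defines a line of response (LOR) through the source.
rng = np.random.default_rng(0)
source = np.array([0.3, -0.2])        # hypothetical annihilation point
n_pix, half = 64, 1.0                 # image grid size and half-width
image = np.zeros((n_pix, n_pix))

for _ in range(2000):                 # 2000 detected coincidences
    theta = rng.uniform(0.0, np.pi)   # random LOR orientation
    d = np.array([np.cos(theta), np.sin(theta)])
    for s in np.linspace(-half, half, 200):   # deposit counts along the LOR
        x, y = source + s * d
        i = int(np.floor((y + half) / (2 * half) * n_pix))   # row
        j = int(np.floor((x + half) / (2 * half) * n_pix))   # column
        if 0 <= i < n_pix and 0 <= j < n_pix:
            image[i, j] += 1

print("hottest pixel:", np.unravel_index(image.argmax(), image.shape))
# The counts cluster at the pixel containing `source`.
```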

Since its inception, PET technology has profited from new developments in radiation detection, first using sodium iodide crystals, then using materials such as bismuth germanate (BGO), which offered better performance, and more recently lutetium oxyorthosilicate, which is faster and gives more light output than BGO.

Specialists from particle physics have also applied new high-speed computing solutions (transputers) to speed up the imaging process.

Dedication’s what you need

Developing these techniques is time-consuming and requires a great deal of motivation as well as resources. In a pure science laboratory like CERN, which focuses on major experiments in particle physics, such spin-off research and development work has sometimes had to take a back seat, and several enthusiasts have left for new pastures.

Among them is physicist David Townsend, who worked at CERN in 1970-1978 before concentrating on PET developments in the US and Europe and finally transferring to the University of Pittsburgh. He is one of the masterminds behind the development honoured by Time. The other is engineer Ronald Nutt, senior vice president and director of research and development at CTI PET Systems – the leading supplier of PET technology – in Knoxville, Tennessee.

Another physicist, Alan Jeavons, who worked at CERN with Townsend in the 1980s to develop the high-density avalanche chamber (HIDAC) PET camera, left physics research to form his own company, Oxford Positron Systems, and he has landed his first major contract to supply advanced PET systems.

Data Grid project gets EU funding

Plans for the next generation of network-based information-handling systems took a major step forward when the European Union’s Fifth Framework Information Society Technologies programme concluded negotiations to fund the Data Grid research and development project. The project was submitted to the EU by a consortium of 21 bodies involved in a variety of sciences, from high-energy physics to Earth observation and biology, as well as computer sciences and industry. CERN is the leading and coordinating partner in the project.

Starting from this year, the Data Grid project will receive in excess of €9.8 million for three years to develop middleware (software) to deploy applications on widely distributed computing systems. In addition to receiving EU support, the enterprise is being substantially underwritten by funding agencies from a number of CERN’s member states. Due to the large volume of data that it will produce, CERN’s LHC collider will be an important component of the Data Grid (see The grid is set to grapple with large computations).

As far as CERN is concerned, this programme of work will integrate well into the computing testbed activity already planned for the LHC. Indeed, the model for the distributed computing architecture that Data Grid will implement is largely based on the results of the MONARC (Models of Networked Analysis at Regional Centres for LHC experiments) project. CERN’s part in the Data Grid project will be integrated into its ongoing programme of work and will be jointly staffed by EU- and CERN-funded personnel.

The work that the project will involve has been divided into numbered subsections, or “work packages”. CERN’s main contribution will be to three of these work packages: WP 2, dedicated to data management and data replication; WP 4, which will look at computing fabric management; and WP 8, which will deal with high-energy physics applications. Most of the resources for WP 8 will come from the four major LHC experimental collaborations: ATLAS, CMS, ALICE and LHCb.

Other work will cover areas such as workload management (coordinated by the INFN in Italy), monitoring and mass storage (coordinated in the UK by the PPARC funding authority and the UK Rutherford Appleton Laboratory) and testbed and networking (coordinated in France by IN2P3 and the CNRS). CERN is also contributing to the work on testbeds and networking, and it is responsible for the overall management and administration of the project with resources partially funded by the EU.

The data management work package will develop and demonstrate the necessary middleware to ensure secure access to petabyte databases, enabling the efficient movement of data between Grid sites with caching and replication of data. Strategies will be developed for optimizing and costing queries on the data, including the effect of dynamic usage patterns. A generic interface to various mass storage management systems in use at different Grid sites will also be provided.
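
The middleware itself is not reproduced here, but the core idea of this work package (a logical file name mapping to physical replicas at several Grid sites, with the cheapest replica chosen at access time) can be sketched; the catalogue, site names, costs and API below are all invented for illustration:

```python
# Toy replica catalogue: one logical file name maps to physical copies at
# several Grid sites, and access picks the "cheapest" replica. All names,
# costs and the API are invented for illustration.
catalogue = {
    "lfn:/lhc/run42/events.dat": {
        "cern.ch":   "gsiftp://cern.ch/store/run42/events.dat",
        "in2p3.fr":  "gsiftp://in2p3.fr/data/run42/events.dat",
        "ral.ac.uk": "gsiftp://ral.ac.uk/cache/run42/events.dat",
    },
}

# Hypothetical network "distance" from the requesting site to each storage site.
cost_from_cern = {"cern.ch": 0, "in2p3.fr": 5, "ral.ac.uk": 9}

def best_replica(lfn: str, costs: dict) -> str:
    """Return the physical file name of the cheapest replica of lfn."""
    replicas = catalogue[lfn]
    site = min(replicas, key=lambda s: costs.get(s, float("inf")))
    return replicas[site]

print(best_replica("lfn:/lhc/run42/events.dat", cost_from_cern))
```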

The objective of the fabric management work package is to develop new automated system management techniques. This will enable the deployment of very large computing fabrics constructed from tens of thousands of mass-market components, with reduced systems administration and operations costs. All aspects of management will be covered, from system installation and configuration through monitoring, alarms and troubleshooting.

WP 8 aims to deploy and run distributed simulation, reconstruction and analysis programs using Grid technology. This package is central to the project because it is among those that enable the large-scale testing of the middleware being developed by the other work package groups and it provides the user requirements that drive the definition of the architecture of the project.

Dozens of physicists, mostly from Europe, will participate in the endeavour while continuing to perform their day-to-day research activities.

A project architecture task force has recently been appointed, with participants from the relevant middleware work packages and a representative from the applications. Leading US computer scientists are also participating in this effort to ensure that developments in the US continue in parallel with work being carried out in Europe. Data Grid is hosting the first Global Grid Forum in Amsterdam in March, which will aim to coordinate Grid activity on a worldwide scale.

Moscow accelerator creates first beam

The TeraWatt Accumulator (TWAC) project at Moscow’s Institute for Theoretical and Experimental Physics (ITEP) has successfully passed its proof-of-principle test. A bunch of carbon-4+ ions from the laser ion source was pre-accelerated in the accelerator/accumulator facility’s new U-3 pre-injector, injected into the UK booster ring and accelerated to 300 MeV per nucleon, stripped to the fully ionized 6+ charge state, and stacked in the U-10 storage ring.

This marks the completion and commissioning of the new facility’s main systems – ion source, ion pre-injector, radiofrequency and power supply for the booster ring, beam transport lines and pulsed magnetic elements.

The essence of ITEP’s TWAC project is to upgrade and modify the ITEP accelerator complex so that it will have a new unique capability for investigating the following fields:

  • extreme states of matter with high density and temperature, and their relation to the physics of stellar interiors;
  • basic research into the properties of nuclear matter (relativistic nuclear physics);
  • medicine and radiobiology, for tumour therapy using carbon ions.

The project takes advantage of a heavy-ion accelerator facility based on two existing synchrotron rings, and it uses a special stripping technique for stacking pulses accelerated in the UK booster into the U-10 storage ring.
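
The stripping trick works because a ring’s bending is governed by the magnetic rigidity Bρ = p/q: raising the charge state at fixed momentum lowers the rigidity by the ratio of charges, so newly injected and already stored beams can be handled on separate orbits. A back-of-the-envelope check at the 300 MeV per nucleon quoted above (a sketch, not the actual TWAC optics):

```python
import math

# Magnetic rigidity of a carbon ion at 300 MeV/nucleon, before and after
# stripping: B*rho [T m] = 3.3356 * p [GeV/c] / Q. Back-of-the-envelope only.
m_u = 0.9315                 # rest mass per nucleon, GeV
t_u = 0.300                  # kinetic energy per nucleon, GeV (quoted above)
p_u = math.sqrt((t_u + m_u) ** 2 - m_u ** 2)   # momentum per nucleon, GeV/c
p_total = 12 * p_u           # carbon-12

for q in (4, 6):             # charge state before and after stripping
    print(f"C{q}+ : B*rho = {3.3356 * p_total / q:.2f} T m")
# The rigidity drops by a factor 4/6, separating injected and stacked orbits.
```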

For this first phase, the ion source, based on a 5 J/0.5 Hz TEA CO2 laser, has been installed and operated in the U-3 pre-injector area. The 20 mA/20 µs carbon-ion beam was matched to the 2 MV/2.5 MHz pre-injector.

The accelerated 16 MeV carbon-4+ ion beam was guided by the new beam transport line to the UK ring and injected. The intensity measured at the injection point is around 1.5 × 10¹⁰ ions in a 15 µs pulse. The carbon beam is then circulated in the UK ring at constant field.

The power supplies of the UK booster ring magnets, the vacuum system and the radiofrequency accelerating cavities have been upgraded, and the carbon beam has been accelerated to 300 MeV per nucleon.

Magnetic components of the beam-transfer line connecting the UK and U-10 rings required for the multi-turn injection scheme have been manufactured, installed and adjusted.

For the second phase of the project, the beamline for extraction to the beam-target interaction area will be designed and constructed this year. Focusing elements and the interaction vacuum chamber will be manufactured and installed in the experimental area.

Special attention will be paid to research and development for modern and sophisticated diagnostics for measurements of dense plasma parameters under unique conditions.

Two new beam transport lines and related slow extraction systems will be designed for beam delivery to the medical and nuclear physics experimental areas. The application of electron cooling for increasing the phase space density of accumulated beam will be investigated and the design of the new linac-injector will be completed.

During the third phase (January 2002 – December 2003), experimental facilities for medical physics and for relativistic nuclear physics will be commissioned. A powerful CO2 laser with 100 J/20 ns output at 1 Hz will be set in operation.

Together with the upgrading of the main accelerator-accumulator systems and the implementation of the pulse-compression system, the intensity of the heavy-ion beam will then reach its maximum (target) values:

  • in ion acceleration mode, up to 4.3 GeV/nucleon and up to 10¹¹ particles/s;
  • in ion accumulation mode, 300-700 MeV/nucleon and 10¹²-10¹³ particles per pulse of approximately 100 ns;
  • in medical application mode, some 250 MeV/nucleon and 10⁹-10¹⁰ particles/s.

HERA-B finds a new direction

The DESY laboratory in Hamburg has accepted a proposal from the HERA-B experiment for a revised programme of research. This follows the recent commissioning of the B-meson factories at SLAC, Stanford and KEK in Japan, and an earlier recommendation by DESY’s Extended Scientific Council to bring HERA-B to an orderly conclusion in the near future.

The HERA-B experiment was approved in 1995 as a dedicated CP-violation experiment. At the time, the only measurement of this phenomenon, which gives a handle on why nature apparently prefers matter to antimatter, came from experiments on kaon decays at CERN and Fermilab. B-mesons (containing the fifth, beauty or “b” quark) were considered to be richer ground for probing CP-violation, and at DESY a copious source of B-mesons could be provided by a wire target in the halo of the proton beam of the HERA electron-proton collider.

Two “B-factories” were also in preparation in the mid-1990s at SLAC and KEK – each of them a single-experiment project based on a novel asymmetric electron-positron collider. While the B-factories were challenging on the accelerator front, HERA-B was faced with the formidable task of finding its signal amid a background some 12 orders of magnitude larger.

Detecting this signal required a detector of unprecedented radiation hardness that was capable of handling equally unprecedented amounts of data. The ensuing search for radiation-hard technologies has led to important advances in tracking-detector technology, but it has also resulted in substantial delays to the physics programme.

Both B-factories started collecting data in 1999, leading to first results being presented in the summer of 2000. HERA-B, however, had not achieved the required level of sensitivity by the end of HERA’s run in August 2000. In response to the recommendation to conclude the HERA-B programme, the collaboration drew up a two-year plan of alternative research that exploits the unique features of the HERA-B spectrometer and trigger systems, and this was approved in December.

The new programme addresses open questions in strong interaction physics and rare decays of charm quarks. The first results are expected in 2002, when a decision will be taken on continuing, possibly with an enlarged programme to include some of the original goals of the experiment.
