
Physics in the Italian Alps

Now in its 19th year, the Rencontres de Physique de la Vallée d’Aoste is known for being a vibrant winter conference, where presentations of new results and in-depth discussions are interlaced with time for skiing. Taking place in La Thuile, a village on the Italian side of Mont Blanc, it consistently attracts a balanced mix of young researchers and seasoned regulars from both theoretical and experimental high-energy physics. The 2005 meeting, which took place from 27 February to 5 March, was no exception.

As well as the sessions on particle physics, cosmology and astrophysics that are standard at such conferences, the organizers always try to include a round-table session on a topical subject, together with a session on a wider-interest topic that tackles the impact of science on society. This year the first of these was Physics and the Feasibility of High-Intensity, Medium-Energy Accelerators, and the second was The Energy Problem.

Dark energy, WIMPs and cannon balls

An increasing number of experiments are trying to answer questions in high-energy physics by taking to the skies, making the distinction between particle physics and astronomy ever fuzzier. The first session of the conference presented an impressive array of experiments and results, ranging from gravitational-wave detection to gamma-ray astronomy. The team working on the Laser Interferometer Gravitational-Wave Observatory (LIGO), with two fully functioning antennas 3000 km apart, now understands its systematics and has begun the fourth period of data-taking with improved sensitivity.

In gamma-ray astronomy, ground-based detectors – which detect the Cherenkov light emitted when gamma-ray-induced particle showers traverse the atmosphere – are constantly improving. The High Energy Stereoscopic System (HESS) in Namibia became fully operational in 2004 with a threshold of 100 GeV, while new detectors with thresholds as low as 20 GeV are in the pipeline. Satellite-based gamma-ray detectors have also provided some excitement, with the Energetic Gamma Ray Experiment Telescope (EGRET) observing an excess of diffuse gamma rays above 1 GeV, uniformly distributed over all directions in the sky.

This excess could be interpreted as due to the annihilation of neutralinos. The neutralino is the supersymmetric candidate of choice as a weakly interacting massive particle (WIMP) – a popular option for the dark matter of the universe. This prompted Dmitri Kazakov of the Institute for Theoretical and Experimental Physics (ITEP), Moscow, to state that “dark matter is the supersymmetric partner of the cosmic microwave background”, since neutralinos can be thought of as spin-½ photons.

The Gamma-Ray Large Area Space Telescope (GLAST) satellite, launching in 2007, will offer an important improvement in gamma-ray astronomy, with sensitivity to 10,000 gamma-ray sources compared with EGRET’s 200.

The DAMA/NaI collaboration raised some eyebrows. It reported an annual modulation of 6.3σ significance in data observed over seven years in its nuclear-recoil experiment at the Gran Sasso National Laboratory, which stopped taking data in 2002. This modulation could be interpreted as due to a WIMP component in the galactic halo, which is seen from Earth as a “wind” with different speeds, depending on the annual cycle. The collaboration’s study of possible backgrounds has not identified any process that could mimic such a signal, but other experiments have not observed a similar effect. The new set-up, DAMA/LIBRA, which is more than twice as big and started taking data in 2003, might shed some light.

Another way of looking for WIMPs is through annihilations that produce antimatter. Antimatter is not produced in large quantities by standard processes in the universe, so any observed excess would be exciting news for WIMP searchers. The Payload for Antimatter Matter Exploration and Light-Nuclei Astrophysics (PAMELA) satellite, due to be launched later this year, will provide valuable data on antiproton and positron spectra.

Alvaro De Rújula of CERN, using traditional (and increasingly rare) coloured transparencies written by hand, gave an account of his theory of gamma-ray bursts (GRBs), which has now developed into a theory of cosmic rays. Central to the theory are the cosmic “cannon balls”, objects ejected from supernovae with a density of one particle per cubic centimetre, and with a mass similar to that of the planet Mercury but a radius similar to that of the orbit of Mars. These cannon balls, moving through the interstellar medium at high speeds (with initial γ factors of the order of 1000), not only explain GRBs and their afterglows in a simple way, but also explain all features of cosmic-ray spectra and composition, at least semi-quantitatively, without the need to resort to fanciful new physics. What the theory does not attempt to explain, however, is how cannon balls are accelerated in the first place.

Dark energy was reviewed by Antonio Masiero of the University of Padova. Masiero pointed out that theories that do not associate dark energy with the cosmological constant do exist. One can assume, for instance, that general relativity does not hold over very long distances, or that there is some dynamical explanation, like an evolving scalar field that has not yet reached its state of minimum energy (known as a quintessence scalar field), or even that dark energy is tracking neutrinos. With the latter assumption, he came to the interesting conclusion that the mass of the neutrinos depends on their density, and therefore that neutrino mass changes with time. The cosmological constant or vacuum-energy approach, however, offers the less exotic explanation of dark energy.

Finally, Andreas Eckart of the University of Cologne reviewed our knowledge of black holes, with emphasis on the massive black hole at the centre of our own galaxy, Sagittarius A*. He played an impressive time sequence of observations of the vicinity of this black hole, taken over 10 years, showing star orbits curving around it.

The golden age of neutrino experiments

The neutrino session began with Guido Altarelli of CERN, who reviewed the subject in some depth. Although impressive progress has been made during the past decade, there are unmeasured parameters that the new generation of experiments must address. The Antarctic Muon and Neutrino Detector Array (AMANDA), which uses the clean ice of the South Pole for neutrino detection, reported no signal from its search for neutrino point-sources in the sky, but the collaboration is already excited about its sequel, IceCube.

The Sudbury Neutrino Observatory (SNO) collaboration has added salt to its apparatus, increasing the detection efficiency by nearly a factor of three compared with the earlier runs. The analysis yields slightly smaller errors on Δm²₁₃ than K2K (KEK to Kamiokande), the long-baseline experiment in Japan, which reported on the end of its data-taking. K2K is now handing over to the Main Injector Neutrino Oscillation Search (MINOS) in the US, which had recorded the first events in its near detector just in time for the conference. MINOS is similar in conception to K2K, but has a magnetic field in its fiducial volume – the first time in such an underground detector – and it will need three years of data-taking to provide competitive results.

The director of the Gran Sasso National Laboratory, Eugenio Coccia, gave a status report on the activities of the laboratory, which is undergoing an important safety and infrastructure upgrade following a chemical leak. The laboratory hosts a multitude of experiments on neutrino and dark-matter physics. These include the Imaging Cosmic And Rare Underground Signals (ICARUS) and Oscillation Project With Emulsion Tracking Apparatus (OPERA) experiments for the future CERN Neutrinos to Gran Sasso (CNGS) project, and Borexino, which is the only experiment other than KamLAND in Japan that can measure low-energy solar neutrinos. The laboratory also houses neutrinoless double-beta-decay experiments.

Strong, weak and electroweak matters

In the session on quantum chromodynamics, Michael Danilov of ITEP had the unenviable task of reviewing the numerous experiments that have looked for pentaquarks. In recent years there have been 17 reports of a pentaquark signal and 17 null results. Danilov justified his sceptical approach by pointing out various problems with the observed signals. The small width of the Θ⁺ is very unusual for strong decays. Moreover, this state has not been seen at the Large Electron Positron (LEP) collider, although this could be explained by assuming that the production cross-section falls with energy. However, the Belle experiment at KEK does not see the signal either, weakening the cross-section argument. The Θc is seen by the H1 experiment at HERA, but not by ZEUS or by the Collider Detector at Fermilab (CDF). Finally, many experiments have not seen the Ξ signal. Although Danilov thinks that the statistical significance of the reported signals has been overestimated, even the reduced significance would be too large for a statistical fluctuation. The question will be settled only by the high-statistics experiments coming soon.

Amarjit Soni of Brookhaven summarized our knowledge of charge-parity (CP) violation, emphasizing the success of the B-factories, the confirmation of the Cabibbo-Kobayashi-Maskawa paradigm, and the fact that we now know how to determine the unitarity-triangle angles α and γ in addition to the previously known angle β.

The electroweak session began with a report on new results from LEP, which shows no sign of having said its final word yet. The running of αQED has been the subject of a new analysis of Bhabha events at LEP. The results from the OPAL experiment, recently submitted for publication, give the strongest direct evidence for the running of αQED ever achieved in a single experiment, with a significance above 5σ. Regarding the W mass, the combined error for LEP now stands at 42 MeV, whereas at the Tevatron, Run II is being analysed and the error from CDF from 200 pb⁻¹ of data (a third of the data collected so far) is already smaller than that of its published Run I result. The Tevatron collaborations expect to achieve a 30-40 MeV error on the W mass with 2 fb⁻¹ of data. The search is on for the Higgs particle at Fermilab, with a new evaluation of the Tevatron’s reach: for a low-mass Standard Model Higgs, the integrated luminosity needed for discovery (5σ) is 8 fb⁻¹, evidence (3σ) needs 3 fb⁻¹, and exclusion up to 130 GeV needs 4 fb⁻¹.
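
For readers who want the formula behind the OPAL measurement: the effective QED coupling grows with momentum transfer because of vacuum polarization, following the textbook relation (quoted here for orientation, not taken from the talk)

$$ \alpha_{\rm QED}(q^2) \;=\; \frac{\alpha(0)}{1 - \Delta\alpha(q^2)}, $$

where Δα(q²) sums the fermion-loop contributions to the photon propagator. The Bhabha-scattering analysis tests precisely this q² dependence.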

From high intensity to future physics

The round-table discussion on physics and the feasibility of high-intensity, medium-energy accelerators was chaired by Giorgio Chiarelli of the University of Pisa, who after a short introduction asked the panel members for their views. Pantaleo Raimondi of Frascati gave an overview of B and φ factories, and Gino Isidori, also of Frascati, pointed to a series of interesting measurements that could be performed with a possible upgrade of the DAΦNE (Double Annular Φ Factory for Nice Experiments) set-up at Frascati, where the time schedule would be a key point.

Francesco Forti of Pisa discussed the possibility of a “super B-factory”. He noted that by 2009 about 1 ab⁻¹ of B-physics data will be available around the world, and that to have a real impact any new machine would need to provide integrated luminosities of the order of 50 ab⁻¹. Roland Garoby of CERN talked about a future high-intensity proton beam at CERN, where the need for a powerful proton driver, a necessary building block of future projects, has been identified. Finally, Franco Cervelli of Pisa reviewed the high-intensity frontier, including prospects for the physics of quantum chromodynamics, kaons, the muon dipole moment and neutrinos. A lively debate followed.

In the interesting science-and-society session on alternative energy sources, Duarte Borba of the Instituto Superior Técnico in Lisbon gave a detailed account of ITER, the prototype nuclear-fusion reactor that is expected to be the first of its kind to generate more energy than it consumes. ITER is designed to fuse deuterium (obtained from water) with tritium bred in situ from lithium bombarded with neutrons, creating helium and releasing energy, carried mainly by neutrons, that is captured through heat exchangers. It is hoped that this ambitious project, with its many engineering challenges, will pave the way for commercial fusion-power plants.

This talk was followed by presentations on geothermal, solar, hydroelectric and wind energy, covering a wide spectrum of renewable energy resources. It was clear from the presentations that the problem of future energy production is complicated, and a clear winner has yet to emerge from these alternative energy sources.

In the session on physics beyond the Standard Model, Andrea Romanino of CERN did not make many friends in the community working towards the Large Hadron Collider (LHC) at CERN. He stated that “split supersymmetry” – a variation of supersymmetry (SUSY) that ignores the naturalness criterion – pushes the SUSY scale (and any SUSY particles) beyond the reach of the LHC, although within the reach of a future multi-tera-electron-volt collider.

Fabiola Gianotti of CERN appeared undeterred. She closed the session and the conference by giving a taste of the LHC’s first data-taking period to come. She reminded the audience that, for Standard Model processes at least, one typical day at the LHC (at a luminosity of 10³³ cm⁻² s⁻¹) is equivalent to 10 years at previous machines.
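
A back-of-the-envelope check makes the claim plausible (the cross-sections below are illustrative assumptions, not numbers from the talk). One day at that luminosity integrates to

$$ \int\!\mathcal{L}\,dt \;\approx\; 10^{33}\ \mathrm{cm^{-2}\,s^{-1}} \times 86\,400\ \mathrm{s} \;\approx\; 9\times10^{37}\ \mathrm{cm^{-2}} \;\approx\; 90\ \mathrm{pb^{-1}}. $$

Taking top-quark pair production as an example, with σ(tt̄) ≈ 800 pb at the LHC against ≈ 7 pb at the Tevatron, a single LHC day would yield some 70,000 tt̄ pairs – about 100 times the roughly 700 pairs produced in the Tevatron’s entire Run I (about 100 pb⁻¹).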

• The conference series is organized by Giorgio Bellettini and Giorgio Chiarelli of the University of Pisa and Mario Greco of the University of Rome.

When research meets commerce

A dynamic technology-transfer department is only one of many factors determining how often and how successfully start-up companies will be created. This belief is not based on any solid evidence, but on a single example with, at best, anecdotal value. The example is from my own experience over the past three years in helping to create SpinX Technologies, a start-up company developing instruments for pharmaceutical research and clinical diagnostics.

The SpinX story started around Christmas 2001 in a branch of IKEA. Following a casual conversation with a stranger sharing his table at the cafeteria, Piero Zucchelli from CERN could not let go of one thought: of the myriad technologies used or developed at CERN, there had to be some just waiting to be exploited outside particle physics. Since my summer-student days, when Piero had been my supervisor, we had worked together frequently, and I had become interested in business as well as in molecular biology. Over the next few weeks, we spent countless hours brainstorming how technology at CERN could be applied to biotechnology.

Gradually, we zoomed in on a field where many advances had been the work of physicists: microfluidics. The idea behind microfluidic devices, often called lab-on-a-chip systems, is to perform biochemical experiments using sub-microlitre volumes rather than millilitre volumes. Liquids are manipulated by running them through channels no wider than a human hair laid out on a silicon or plastic substrate. The applications are mostly biochemical and range from food testing to drug discovery.

What struck us most was that all microfluidic devices were specialized for a specific type of biochemical experiment, characterized by a well-defined but fixed sequence of operations. There were none on which different protocols could be “programmed” by setting a series of valves on the same chip. Before long, Piero had come up with a valve implementation using an infrared laser. With this Virtual Laser Valve, everything started to fall into place and we soon had all the elements of a programmable microfluidics platform.

At some point a friend at CERN suggested that, if we were serious about this idea and believed it had commercial value, we should take it to serious investors. He would make the introductions. With nothing but an idea, we crossed the doorstep at Index Ventures, a venture-capital fund managing €750 million. It took us half a year to build their interest. For us, this was a turning point. Leaving the comfort of stable, well paid positions in research, we had to devote all our time and energy to SpinX with no guarantee of success.

Our case illustrates where a technology-transfer department can make a very real contribution, but also where it is essentially powerless. Contacts with venture-capital funds, law firms, business consultants and so on are obviously helpful to any aspiring entrepreneur, and a technology-transfer department is ideally placed to build this kind of network. But the key factor in attracting high-quality capital is commitment. Francesco de Rubertis, a partner at Index Ventures, says, “We do not invest in technologies, but in the teams that can make them happen.” Venture capitalists expect the people behind an idea to be entirely devoted to their start-up. Anything less is a sure deal-breaker.

Throughout the process, the interaction with CERN in general and technology transfer at CERN in particular could not have been more productive. Both of us were offered leave from the moment we started to work full time until the moment the investors committed. During the due-diligence process, the technology-transfer department provided us in record time with a legal statement clarifying the ownership of the intellectual property.

We did convince Index Ventures of our business concept; SpinX was created, and the experience has been extremely rewarding. Today we are a company of 10 people: we have identified the first application of our technology, we have built a working system for that application, and we are talking with several pharmaceutical companies interested in evaluating the system.

Would SpinX exist if it hadn’t been for CERN? I doubt it. We may not have started from any specific technology at CERN, but we could not have done it without the experience built there. It was at CERN that we learned the importance and benefits of international, multidisciplinary teams that “try the impossible”. Among its 10 people, SpinX counts seven nationalities and six doctorates, with backgrounds ranging from physics and engineering to biochemistry and enzymology. Like the particle detectors at CERN, the instrument we developed uses a host of off-the-shelf components from widely varying industries. Finally, there is the undeniable value of the CERN brand. Recently, after a conference presentation of the technology, a senior executive at Eli Lilly commented that if former CERN physicists could not get it to work, then nobody could!

Large Hadron Collider Phenomenology

by M Krämer and F J P Soler (eds), Institute of Physics Publishing. Hardback ISBN 0750309865, £75 ($125).

The Large Hadron Collider (LHC) is often described as the machine needed by the worldwide community of high-energy particle physics experimentalists and theorists to search for and, it is hoped, discover physics signals beyond those expected from the Standard Model of particle interactions. The general-purpose experiments now completing construction (ATLAS and CMS) are often described as huge facilities optimized for the search for the elusive Higgs boson, the one key element missing in the Standard Model. About three years from now, the whole community in our field will focus on new and, we hope, unexpected physics results. These will cover a wide range of topics, extending over all possible theoretical conjectures published to date, that are relevant to experiments at the scale of tera-electron-volts.

Given the dearth of guidance from experiments (aside from the beautiful but maddening agreement of even the most precise measurements with the predictions of the Standard Model), the driving goal of all theoretical developments beyond this very same Standard Model is to solve the many fundamental issues in particle physics – issues that today are also relevant to cosmology, a science that has become much more mature experimentally over the past 10 years or so.

This book presents a series of lectures attempting to cover LHC phenomenology in the broadest possible sense. They range, on one side, from the intricacies of scalar fields, string theory and extra dimensions to the basics of detector physics, which for more than 10 years has guided R&D in our field and has led to the optimized design of the huge and complex detectors needed to extract the minute signals of today’s exciting physics (electroweak gauge-boson production, quantum-chromodynamic multi-jet events and heavy-flavour production) from the huge backgrounds. But the lectures also range from accelerator science to modern e-science (the birth of the computing Grid), and from the less well known intricacies of heavy-ion physics to those of forward physics, where diffractive and quasi-elastic phenomena dominate.

These lectures were meant for today’s young physicists, many of whom will surely be the driving force behind the physics analyses and publications of the LHC experiments over the coming years. They were delivered on the occasion of the 57th Scottish Universities Summer School in Physics in summer 2003, by a well balanced mix of experienced theorists (D Ross, K Ellis and J Ellis) and seasoned experimentalists (V Gibson, H Hoffmann, B Müller, M A Parker, A de Roeck, R Schmidt and T S Virdee).

The emphasis in most of the lectures was on giving a snapshot of the current status of understanding in theory, phenomenology, and accelerator and detector performance. The inevitable fate of such snapshots is to become obsolete fairly quickly, given the huge ongoing development of both the hardware and the software (should one add the middleware?) needed to operate the experiments, to simulate their performance accurately, to analyse their data quickly but unerringly, and to give all participants on all continents a fair chance to join in the fun in the summer of 2007. Another drawback of such attempts is that, unavoidably, certain topics are treated superficially. However, having read large sections of the book, I believe that this is largely outweighed by the benefit, for the young and less young reader alike, of finding in one volume a remarkably complete coverage of all aspects relevant to LHC physics, with a bibliography rich enough to support in-depth reading.

For example, the reader interested in the phenomenology of quantum chromodynamics (QCD) beyond its direct application to LHC physics is referred to the book QCD and Collider Physics (by K Ellis, J Stirling and B Webber, 1996), the reader interested in more in-depth studies of accelerator physics and technology is referred to the Handbook of Accelerator Physics and Engineering (by A Chao and M Tigner, 2002), and the reader interested in the design and optimization of the general-purpose ATLAS and CMS detectors is referred to “Experimental challenges in high luminosity collider physics” (by N Ellis and T S Virdee, Ann. Rev. Nucl. Part. Sci. 44 609, 1994) and to all the Technical Design Reports published from these experiments between 1996 and 2005.

In summary, this book is an excellent introduction to LHC physics for any person entering the field now, at a moment when a huge effort from the whole community is still ongoing to meet the difficult challenge of assembling the various jigsaws needed to observe the first proton-proton collisions at the tera-electron-volt scale in summer 2007.

The reader has to be aware, though, that apart from the foundations of the Standard Model, of supersymmetric and string theories, and of particle interactions in matter, many of the details provided in the lectures to illustrate the wonderful and exciting potential of the LHC and its associated detectors are to be considered as examples only. These will most likely bear little resemblance to the final published results a few (or many) years from now. I believe that most experimentalists, who have devoted a large fraction of their professional lives to making the LHC dream come true, hope that reality at the tera-electron-volt scale is something quite different from what has been envisaged to date by our theory colleagues. It is indeed the fulfilment of such a hope that can give a new and much-needed impetus to our field, thereby surely opening up rich and thrilling prospects for the generations of theorists and experimentalists to come.

First dipole descends to LHC

On 7 March the first of the superconducting dipole magnets for the Large Hadron Collider (LHC), under construction at CERN, was lowered into the accelerator tunnel.

The 15 m-long dipoles, each weighing 35 t, are the most complex components of the machine. In total, 1232 dipoles will be lowered 50 m below the surface via a special oval shaft. They will then be taken through a transfer tunnel to their final destination in the LHC tunnel, carried by a specially designed vehicle travelling at 3 km per hour.

In addition to the dipole magnets, the LHC will be equipped with hundreds of smaller magnets. More than 1800 magnet assemblies will have to be installed. Once in position, the magnets will be connected to the cryogenic system to form a large string operating with superfluid helium, which will maintain the accelerator at a temperature of 1.9 K.

The lowering of this first magnet into the tunnel coincided with another milestone: the delivery of half of the superconducting dipole magnets. The remaining 616 dipoles are due to arrive by autumn 2006. The construction of these superconducting magnets represents a huge challenge both for CERN and for European industry; for example, some 7000 km of niobium-titanium superconducting cable has had to be produced to form the magnet coils.

Altogether some 100 companies in Europe are involved in manufacturing the magnet components. The greatest challenge was the move from the prototyping and pre-series phase to large-scale production. This has been met successfully, and three industrial sites, in France, Germany and Italy, are now manufacturing about 10 magnets each week.

B-factory achieves record luminosity

On 19 February the Belle experiment running at Japan’s KEKB accelerator, the KEK B-factory, accumulated a record integrated luminosity of 1 fb⁻¹ in a single day, corresponding to roughly 1 million BB̄ meson pairs.

KEKB’s design luminosity of 1 × 10³⁴ cm⁻² s⁻¹ was first reached in May 2003. Since then the record has regularly been broken and on 15 February a new peak of 1.516 × 10³⁴ cm⁻² s⁻¹ was achieved. On average the KEKB luminosity is about 20% higher than it was a year ago. During operation of the TRISTAN accelerator at KEK from 1987 to 1995, the total integrated luminosity seen by the VENUS detector was 400 pb⁻¹. Belle is now collecting the same amount of data in less than half a day.
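
These numbers hang together with simple arithmetic (taking the standard Υ(4S) → BB̄ production cross-section of about 1.1 nb as an assumed input, since the article does not quote it):

$$ N_{B\bar B} \;\approx\; \sigma_{B\bar B} \int\!\mathcal{L}\,dt \;\approx\; 1.1\ \mathrm{nb} \times 10^{6}\ \mathrm{nb^{-1}} \;\approx\; 1.1\times10^{6}\ \text{pairs per}\ \mathrm{fb^{-1}}, $$

while the peak luminosity sets the scale of the daily record: 1.5 × 10³⁴ cm⁻² s⁻¹ sustained over 86 400 s would integrate to about 1.3 fb⁻¹.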

Most of the performance increase is due to the novel scheme of continuous beam injection used at KEKB in which the detector keeps taking data while the electron and positron beams are being injected into the accelerator. This was previously thought to be almost impossible owing to the large noise introduced by the injected beams. However, the KEK accelerator group has developed a sophisticated scheme of continuous beam injection, while the detector group has also developed an electronics system that is more tolerant to noise.

Russian team builds biggest MDT chambers for muon spectrometer

The Institute for High Energy Physics (IHEP) in Protvino, Russia, is producing some of the largest and most challenging chambers for the muon spectrometer of the ATLAS detector at CERN. Monitored drift-tube (MDT) chambers come in a variety of sizes, but the 192 chambers now being produced at IHEP include 16 with a length of 6.3 m, and together they incorporate 60,000 precision drift tubes.

MDTs have been constructed in many institutes in Europe, the US, Russia and China, but this is the first time that chambers of this size have been successfully produced. Despite the huge size of the chambers, the 50 μm thick anode wires are positioned to better than 20 μm. Production in Protvino is expected to finish by mid-April.

Tsunami earthquake detected in ATLAS cavern

During the Christmas break, the hydrostatic level sensors (HLSs) in the ATLAS cavern revealed a new facet of their capabilities. Installed by the CERN survey group to monitor any deformation or movement of the structure on which the detector feet rest, these sensors, which have submicrometre resolution and are coupled to the heavy ATLAS mechanical infrastructure, took on the function of a seismograph.

[Figure: the signals recorded by the HLS sensors]

The signals recorded by the sensors are shown in the figure, which reveals two perturbations, one on 23 December starting at 15:45 GMT and the other on 26 December at 01:23 GMT. Seeing these unusual readings raised the question of whether they were connected with the earthquake off the Indonesian coast that gave rise to the devastating tsunami.

The Geneva Centre for the Study of Geological Risks was duly contacted and it confirmed that the earthquake off the coast of Sumatra, which measured 9.0 on the Richter scale, was indeed responsible for the large peak recorded at CERN. When a seismic event occurs, the resulting vibrations spread out in all directions and two types of wave can be distinguished: primary waves, which propagate through the Earth at speeds of 6-8 km s⁻¹, and the slower waves that are confined to the surface of the Earth (such as the horizontal Love wave, which can cause structural damage to buildings).

The epicentre of the Sumatra earthquake was some 9000 km from CERN, and the earthquake happened at 00:59 GMT (07:59 local time). The primary waves would have needed about 20 minutes to reach the ATLAS cavern, which is consistent with the first perturbations recorded by the sensors at 01:23 GMT on 26 December.
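
The timing is easy to verify from the distance and wave speed quoted above:

$$ t \;\approx\; \frac{d}{v} \;=\; \frac{9000\ \mathrm{km}}{6\text{-}8\ \mathrm{km\,s^{-1}}} \;\approx\; 19\text{-}25\ \mathrm{min}, $$

which brackets the 24 minutes that elapsed between the 00:59 GMT earthquake and the 01:23 GMT signal in the cavern.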

The earlier, smaller perturbation is linked to another earthquake measuring 8.1 on the Richter scale, which is thought to have been correlated with the earthquake of 26 December. It happened at 14:59 GMT on 23 December north of Macquarie Island (between Australia and Antarctica), much further away from CERN.

First neutrinos head for MINOS

The Main Injector Neutrino Oscillation Search (MINOS) experiment was officially inaugurated in a ceremony at Fermilab on 4 March. MINOS is the latest weapon in the arsenal of neutrino-oscillation searches. Its main goal is to measure the largest difference in mass-squared between different neutrino species (Δm²₂₃) with an accuracy of 10% – more than a factor of two better than it is known today.
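
Behind this goal lies the standard two-flavour oscillation formula (textbook physics, not anything specific to MINOS): the muon-neutrino survival probability is

$$ P(\nu_\mu \to \nu_\mu) \;\simeq\; 1 - \sin^2 2\theta_{23}\,\sin^2\!\left(\frac{1.27\,\Delta m^2_{23}\,[\mathrm{eV^2}]\;L\,[\mathrm{km}]}{E\,[\mathrm{GeV}]}\right), $$

so with the baseline L fixed by geography, the position of the dip in the measured neutrino energy spectrum determines Δm²₂₃, and its depth determines the mixing angle.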

MINOS takes over from the KEK to Kamioka (K2K) experiment in Japan, which has finished taking data with a similar set-up. The unique feature of MINOS, however, is its 1.5 T magnetic field. This enables the experiment to distinguish positively and negatively charged tracks and hence discriminate between neutrinos and antineutrinos.

MINOS uses a neutrino beam produced by Fermilab’s Neutrinos at the Main Injector (NuMI) facility, where 120 GeV protons from the Main Injector hit a graphite target, producing hadrons including pions. A “horn” focusing system selects positive pions, which then decay in a 700 m-long decay pipe. After passing through a beam absorber, the beam comprises mostly muon neutrinos. An important advantage of the system is that the energy of the neutrinos can be tuned by moving the horn focusing system.
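
The energy tuning follows from simple two-body decay kinematics. The short sketch below is an illustration using standard particle masses, not code from the experiment; it shows how the forward neutrino energy tracks the energy of the pions that the horn chooses to focus.

```python
# Illustrative sketch: lab-frame energy of the nu_mu from pi+ -> mu+ + nu_mu.
# In the forward direction the neutrino carries ~43% of the pion energy,
# which is why refocusing the horn onto different pion energies tunes the beam.

M_PI = 0.13957  # charged-pion mass [GeV]
M_MU = 0.10566  # muon mass [GeV]

def neutrino_energy(e_pi, theta):
    """Approximate nu_mu energy for a pion of energy e_pi [GeV] decaying
    at a small angle theta [rad] to the beam axis (relativistic limit)."""
    gamma = e_pi / M_PI                        # pion boost factor
    forward_fraction = 1.0 - (M_MU / M_PI)**2  # 1 - (m_mu/m_pi)^2, about 0.43
    return forward_fraction * e_pi / (1.0 + (gamma * theta)**2)

# Sample pion energies that a horn system might select:
for e_pi in (5.0, 10.0, 20.0):
    print(f"E_pi = {e_pi:4.1f} GeV -> forward E_nu = {neutrino_energy(e_pi, 0.0):.2f} GeV")
```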

The neutrino beam is aimed at the MINOS “far” detector, located in the Soudan Underground Laboratory in northeastern Minnesota, some 730 km from Fermilab. The laboratory is 700 m underground in an old iron mine. To reduce systematic errors by measuring the beam composition and neutrino energy spectrum directly, a “near” detector, essentially a miniature of the 6000 t far detector, is also incorporated in the experiment 1 km from the target.

An important milestone was reached on 4 December 2004, when the first beam reached the target hall. The horns were powered in January and the near detector has already recorded its first events.

• MINOS is a collaboration of 200 scientists, engineers, technical specialists and students from 32 institutions in Brazil, France, Greece, Russia, the UK and the US.

US budget changes priorities for HEP

On 8 February the White House released its budget proposal for the financial year 2006. The science and technology budget of the US Department of Energy has been reduced overall by about 3.8% compared with 2005, whereas the budget for high-energy physics (HEP) is reduced by about 3%. The proposal is pending approval by Congress.

The HEP programme for 2006 has been structured in such a way “not only to maximize the scientific returns on our investment in these facilities, but also to invest in R&D now for the most promising new facilities that will come online in the next decade”. This has necessitated some prioritization.

The planned operations, upgrade and infrastructure for the Tevatron at Fermilab are cited as the highest priority, with a high priority also given to operations, upgrades and infrastructure of the B-factory at SLAC. However, B-factory operations will be terminated by 2008 at the latest. Support for a leadership role for US research groups in the physics programme for the Large Hadron Collider at CERN will also continue to be a high priority, and the preconceptual R&D needed to explore the nature of dark energy will continue in 2006.

A major casualty is the engineering design of the B Physics at the Tevatron (BTeV) experiment, which was scheduled to begin in 2005 as a new “major item of equipment” and will instead be terminated by the end of 2005. The reasons given are the timescale and the “lesser scientific potential” compared with other projects, although BTeV is “still important scientifically”. Support for the project was strong only on condition that it could be completed by 2010, which is “not feasible given schedule and funding constraints”.

Support for a future electron-positron linear collider, however, has increased relative to 2005 for “the continued international participation and leadership in linear collider R&D and planning by US scientists”. R&D for other new accelerator and detector technologies, particularly in the emerging area of neutrino physics, will also increase.

WASA finds a new home at COSY

The Wide Angle Shower Apparatus (WASA) detector, currently at the CELSIUS facility of The Svedberg Laboratory (TSL) in Uppsala, Sweden, is to find a new home. CELSIUS was commissioned in 1983, using the hardware of CERN’s ICE ring, and its experimental programme will end in summer 2005. The WASA detector, built in the 1990s by a collaboration between Sweden, Poland, Germany, Russia and Japan, will then be relocated to the Cooler Synchrotron (COSY) ring at the Forschungszentrum Jülich (FZJ) in Germany.

WASA is a fixed-target 4π detector comprising a central part and a forward part. The central detector, built around the interaction point, is designed mainly for the detection of the decay products of π⁰ and η mesons: photons, electrons and charged pions. It consists of an inner drift chamber, a superconducting solenoid and a caesium-iodide calorimeter. The forward detector, designed to detect target recoil and scattered beam particles, consists of 11 planes of plastic scintillator counters and proportional counter drift tubes. The target consists of a beam of frozen hydrogen or deuterium pellets about 25 μm in diameter, which will allow luminosities of up to 10³² cm⁻² s⁻¹ in interactions with the circulating beam at COSY.

The transfer of WASA to COSY will be mutually beneficial. Photon detection is important for understanding the physics of hadronic reactions, since many of the produced mesons and excited baryonic states have a significant number of decay branches into multi-photon final states. This calls for a detector with a wide-acceptance electromagnetic calorimeter. Until now, such a detector has been missing from COSY, and WASA fits the bill nicely.

WASA will also benefit from the higher energy of the COSY beam compared with CELSIUS, an energy well above the threshold for η’ production in proton-proton interactions. (COSY offers beam momenta of up to 3.7 GeV c⁻¹ with polarized and cooled proton and deuteron beams, whereas CELSIUS can only go up to 2.1 GeV c⁻¹.) WASA will be shipped to the Forschungszentrum Jülich this autumn, and the experimental programme is expected to start at the beginning of 2007. Once at COSY, the WASA detector offers an opportunity to deepen our understanding of non-perturbative quantum chromodynamics (QCD) through precise studies of symmetry breaking and very specific investigations of hadron structure.

For example, the η and η’ decays that vanish in the limit of equal light-quark masses (for example η’→ηππ) allow the exploration of explicit isospin-symmetry breaking in QCD. Furthermore, precision measurements of rare η and η’ decays can be used to obtain new limits on the breaking of the charge, parity and time symmetries or their combinations. Last but not least, WASA at COSY can contribute significantly to testing the various models offered to explain exotic and crypto-exotic hadrons – such as the light scalar mesons a₀/f₀(980), pentaquarks like the Θ⁺ or hyperon resonances like the Λ(1405) – through precise measurements of decay chains and couplings to other hadrons.

Another promising process where precise measurement can confront theoretical predictions is the isospin-violating reaction dd→απ⁰. Pioneering measurements have already been performed at the Indiana Cooler. At COSY such studies can be extended to higher energies and, in particular, to the reaction dd→απ⁰η, which should be driven by the isospin-violating a₀-f₀ mixing.

COSY can produce more than 10⁶ η’ mesons per day, and their subsequent hadronic, radiative, leptonic and forbidden decays can be detected by WASA. The expected event rates will substantially increase world statistics.

• The WASA-at-COSY project is a collaborative effort between many institutions, in particular TSL and FZJ. The project currently comprises 137 members from 24 institutes in seven countries.
