CESR set to bow out of B-particle business

After some 20 years of making milestone contributions to the physics of B-particles (which contain the fifth quark, known as beauty or “b”), the CLEO collaboration at Cornell’s CESR electron-positron collider is now looking to step down in energy.

It has identified a broad programme of important physics that can be studied in the tau/charm threshold region with a luminosity of 3 x 10^32 cm^-2 s^-1. Simultaneously, Cornell is studying the feasibility of converting CESR to such a facility.
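For a sense of what such a luminosity delivers, the event rate for a process of cross-section σ is R = L·σ. A minimal sketch, in which the cross-section value is an assumption for illustration (the article quotes only the luminosity):

```python
# Event-rate estimate R = L * sigma.
# The luminosity is the figure quoted in the article; the cross-section
# is an assumed, illustrative value, not taken from the article.
L = 3e32            # luminosity in cm^-2 s^-1
nb = 1e-33          # one nanobarn expressed in cm^2
sigma = 3.0 * nb    # assumed e+e- cross-section near charm threshold
rate = L * sigma    # interactions per second
print(rate)         # -> about 0.9 events per second
```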

A workshop will be held at Cornell on 5-7 May to provide an opportunity for the elementary particle physics community to explore the opportunities provided by the CLEO III detector operating in this energy/luminosity region. More information is available at http://www.lns.cornell.edu, or e-mail spoke@mail.lns.cornell.edu.

With high-luminosity B-factories now in operation – PEP-II at SLAC in Stanford, and KEKB in Japan – CESR is looking for alternative research topics.

Isotope source reaches full energy

The new Isotope Separator and Accelerator (ISAC-I) at the Canadian TRIUMF laboratory in Vancouver has
reached its maximum energy on schedule.

On 21 December 2000 a beam of singly charged helium-4 ions was accelerated through the ISAC drift-tube linac to 1.5 MeV per nucleon.

A major component of the laboratory’s programme over the coming years will be the planned upgrade to ISAC-II, boosting the energy to 6.5 MeV per nucleon and extending the menu to cover isotopes in the mass range 30-150.

A forthcoming article will cover ISAC commissioning in more detail.

Micropattern detectors promise a big future

[Figure: cernmicro1_2-01]

The pioneering work at the beginning of the 20th century by Thomson, Rutherford and Geiger, just after the discovery of ionizing radiation, focused attention on the development of tools to detect this radiation. The single-wire proportional counter (Geiger counter) became an essential physics tool.

Georges Charpak’s 1968 invention of the multiwire proportional chamber (MWPC) ushered in a new era, with a major impact on high-energy physics. The main performance features of the MWPC are a space resolution of a few hundred micrometres, two- and three-dimensional localization of incident radiation, excellent energy resolution and rate capability of a few kilohertz per square millimetre.

Other MWPC applications include crystal diffraction, beta chromatography and dual-energy angiography. A low dose X-ray digital radiography scanner based on the MWPC developed at Novosibirsk is currently being used routinely in hospitals in Russia and France.

Despite this success, some basic limitations of MWPCs restrict their use at high rates. The wire spacing defines the best achievable position accuracy and limits two-track resolution to about 1 mm. Electrostatic instability limits the stable wire lengths. The widths of the induced charges define the pad response function and, at high rates, the accumulation of positive ions spoils the rate capability.

The advent of high-luminosity colliders demands fast, high-performance position-sensitive detectors. Key requirements are unsurpassed position localization; good two-track, two-dimensional and time resolutions; and the ability to withstand hostile radiation over a considerable period of time.

The microstrip generation

[Figure: cernmicro2_2-01]

The invention of the microstrip gas chamber (MSGC) by Anton Oed marked another era of gaseous detectors. An MSGC comprises a pattern of thin anode and cathode strips on an insulating substrate with a pitch of a few hundred micrometres. With a drift electrode and with appropriate potentials applied, the electric field is such that positive ions are removed immediately from the avalanches, increasing rate capability by some two orders of magnitude.

The salient features are localization accuracy of some 30 µm, double-track resolution of 400 µm and good energy resolution. Long-term and magnetic field operations have been demonstrated, and these devices have found applications in many fields of X-ray spectrometry, digital radiography and high-energy physics.

Difficulties began when MSGCs were exposed to the highly ionizing particles that are usually present in a high-luminosity machine. These particles deposit in the detection volume almost three orders of magnitude more charge than a minimum-ionizing particle.

In the case of microstrip detectors, the anode-cathode distance is small compared with that in a wire chamber, and, with electric fields at the tip of the streamer and along the surface being high, the streamer is likely to be followed by a voltage- and ionization-density-dependent discharge. The charging up of surface defects, long-lived excited states and overlapping avalanches appear to be the culprits, lowering the discharge limits of operation. With this insight, several novel designs appeared.

The detection of micropatterns

Advances in photolithography and the application of silicon foundry techniques heralded a new era in the design and fabrication of “micropattern detectors”. The microdot (introduced by Biagi) is the ultimate gaseous pixel device, with anode dots surrounded by cathode rings. Although achieving gains of about 1 million, it does not discharge, probably because the field emulates the 1/r field of an anode wire.

A very asymmetric parallel plate chamber, the MICROMEGAS detector invented by Charpak and Giomataris, takes advantage of the behaviour at high fields (100 kV/cm) in several gas mixtures, thus achieving stable operation with minimum-ionizing particles at high gains and rates. Large MICROMEGAS detectors are being made and tested for the COMPASS experiment at CERN.

A new detector invented by Lemonnier is the CAT (compteur à trous). This comprises a narrow hole micromachined into an insulator metallized on the surface, which acts as the cathode, while the metal at the bottom of the hole constitutes the anode. With appropriate potentials and a drift electrode, this scheme acts as a focusing lens for the drifting electrons left in the wake of ionizing radiation.

Removing the insulator leaves the cathode as a micromesh, which, with a thin gap between it and the read-out electrode, emulates CAT operation (hence microCAT or µCAT). This structure offers gains of several 10^4. Another option uses ingenious read-out from “virtual pixels” made by current sharing, giving resolution 20 times finer than the read-out cell and 400 times as many virtual pixels. The µCAT combined with the pixels is called the VIP (figure 1).

A new concept of gas amplification introduced by Sauli in 1996 is the gas electron multiplier (GEM), manufactured using printed circuit wet-etching techniques. A thin (50 µm) Kapton foil clad on both sides with copper is perforated, and the two surfaces are maintained at a potential difference, thus providing the necessary field for electron amplification (figure 2).

Coupled with a drift electrode above and a read-out electrode below, it acts as a high-performance electron amplifier. The essential features of this detector are that amplification and detection are decoupled and the read out is at zero potential. Charge transfer to a second amplification device opens up the possibility of using a GEM in tandem with an MSGC or a second GEM.
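The amplification field in the GEM channels comes from the potential difference across the 50 µm foil. A rough sketch, where the 400 V operating voltage is an assumed, typical-looking value rather than a figure from the article:

```python
# Nominal field across a GEM foil: E = dV / t.
# Foil thickness is the 50 um quoted in the article; the voltage is assumed.
t_foil = 50e-6              # Kapton foil thickness in metres
delta_v = 400.0             # assumed potential difference across the foil, volts
e_field = delta_v / t_foil  # field in V/m
print(round(e_field / 1e5, 1))  # in kV/cm -> 80.0
```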

With these developments and a better understanding of the discharge phenomena, new detectors have appeared: Micro-Wire, an extension of the µDOT in the third dimension, Micro-Pin Array, Micro-Tube, Micro Well, Micro Trench and Micro Groove. All aim for minimal insulator between the anode and cathode to reduce discharges. The Micro-Tube uses a combination of laser micromachining and nickel electroplating, and gives an electric field that increases rapidly at the anode, similar to the µDOT. However, there is no insulating material on the direct line of sight from cathode to anode. These features are predicted to lead to higher gas gains, better stability with fewer discharges and the reduction of charging effects.

Listening for the music of gravity

[Figure: cerngrav1_2-01]

Two of the world’s interferometric gravitational wave detectors – Japan’s TAMA project and the LIGO laboratory in the US – have recently attained two significant milestones in the ongoing quest to detect gravitational waves.

The 300 m TAMA interferometer near Tokyo, which achieved extended servo lock in 1999 (see The quest for gravitational waves), has continued to pioneer the field. Last September, in a thrilling two-week test run, TAMA logged 160 h of interferometer “in-lock” operation, proving the viability of the technique as well as its reliability.

Not only did the TAMA team far surpass its target of 100 h in-lock, but it also maintained individual lock periods lasting as long as 12 h, in spite of the dauntingly noisy local seismic environment.

During the in-lock time of an interferometer – equivalent to the storage time of a particle collider – the instrument listens intently for the whispers of the universe. Longer in-lock time means more efficient listening capabilities.

Greater sensitivity

Even more important is the fact that, in less than two years, the TAMA group has lowered the noise floor by more than two orders of magnitude, to an astonishing strain sensitivity of 10^-21. In a small frequency range, this is close to the design value.

The strain sensitivity of an interferometric gravitational wave detector is analogous to the luminosity (collision rate) of a particle collider. The greater the sensitivity, the greater the range of the interferometer, and the shorter the time it must wait to register any signal of a stellar collision or an explosive event.
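To make the strain figure concrete: a strain h corresponds to an arm-length change ΔL = h·L. A quick check using the two numbers quoted for TAMA (300 m arms, 10^-21 strain):

```python
# Arm-length change corresponding to a given strain: dL = h * L.
# Both input numbers are quoted in the article.
h = 1e-21      # strain sensitivity (dimensionless)
arm = 300.0    # TAMA arm length in metres
dL = h * arm   # length change in metres
print(dL)      # -> about 3e-19 m, far smaller than an atomic nucleus
```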

LIGO laser beams

[Figure: cerngrav2_2-01]

Meanwhile, in the desert near Hanford, Washington State, the LIGO group has been chasing laser beams down the twin 2 km beam tubes of the first, and smallest, of LIGO’s three gravitational wave interferometers.

Various partial configurations have been tested during recent months, including a recombined Michelson interferometer with Fabry-Perot arms. Lock periods of several hours were solidly achieved, although without the power recycling system planned for the final configuration.

Then, in a momentous week last October, both Fabry-Perot arms of the complete Michelson interferometer came alive, with the power recycling cavity fully operational.

The latter is so named because it gathers the return laser power from the interferometer and recycles it. When operating, this configuration stores up to 30 times as much power in the arms, thereby increasing the interferometer’s high-frequency sensitivity. One drawback is that the power recycling mirror makes the interferometer controls an order of magnitude more difficult.

As these breakthroughs were being made, the atmosphere at LIGO was tense but cautiously optimistic. The locks achieved were somewhat unstable – lasting a few minutes at the most – and the stored beam power was always very low. Nevertheless, it was reassuring to see that everything was working, even if only for a short length of time.

The importance of these locks for LIGO is comparable with the first flight of the Wright brothers, of which an observer said: “It doesn’t stay up long. It isn’t very far off the ground. But it does fly!”

A long and challenging road must still be travelled before gravitational wave physicists can hope to snare these elusive ripples. The TAMA group must improve the interferometer’s high-frequency sensitivity curve by enhancing laser stability and introducing their power recycling cavity, while also implementing a state-of-the-art seismic attenuation system on the low-frequency end.

With the start-up of the Hanford 2 km interferometer, LIGO’s commissioning team has only just begun the work that will be required to render it fully operational. The LIGO crew will have to follow the same troubleshooting and improvement route as their Japanese colleagues. A series of engineering runs will be used to collect data and pinpoint any sources of trouble.

High energies and high altitudes

[Figure: cerncosmic1_1-01]
[Figure: cerncosmic2_1-01]

Before the advent of particle accelerators, cosmic rays were the source of many major physics discoveries. As physicists realize that the energies attainable using terrestrial accelerators represent only a tiny sliver of nature’s wide window, cosmic ray physics is becoming fashionable once more.

A recent meeting of cosmic ray physicists in La Paz, Bolivia, examined the research programme of the Chacaltaya Laboratory, one motivation being the recent declaration of support for the laboratory by the Centro Latinoamericano de Fisica.

At 5220 m (540 g/cm^2 barometric pressure), Chacaltaya is the highest continuously operating cosmic ray research laboratory in the world, and barely an hour’s drive from the outskirts of La Paz.

Several speakers recalled the early history of the laboratory, including the discovery of the pion in nuclear emulsions by Lattes, Occhialini and Powell (seen also in emulsions exposed on the Pic du Midi in the Pyrenees). Other unusual emulsion chamber observations include the still mysterious Centauro events.

Local experiments

Several reports came from experimental collaborations currently working at Chacaltaya. The Bolivian Airshower Joint Experiment (BASJE) group’s work was discussed by H Yoshii (Ehime) and others. This collaboration, operating since 1962, currently utilizes about 80 scintillation detectors in a 60 x 60 m array.

An interesting result is the observation of a galactic anisotropy in arrival directions, an enhancement of the primary cosmic ray flux between 270° and 300° galactic longitude. As the events showing this anisotropy have a normal muon content, the conclusion is that the effect is due to primary cosmic ray nuclei and not gamma rays.

A Ohsawa (Institute for Cosmic Ray Research, Tokyo) and N Ohmori (Kochi) discussed results from the Saitama-Yamanashi-San Andres (SYS) collaboration detector – an array of 32 emulsion chambers, each 0.25 m^2 and each containing 15 cm of lead plates, mounted on a thick scintillator. This array opens up the study of the hadron component of an air shower core in correlation with the electromagnetic component.

Among the conclusions based on their observations are a confirmation of a breakdown of Feynman (kinematic) scaling at energies of 1 PeV (10^15 eV) and above; a scarcity of hadrons compared with the predictions from simulations; and a decrease in inelasticity with increasing energy (inelasticity is the average value of 1 - K, where K is the fraction of the incident energy retained by the most energetic hadron in the final state of a nuclear interaction).
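The inelasticity definition in parentheses is easy to make concrete. A minimal sketch; the sample K values are invented purely for illustration:

```python
# Inelasticity as defined in the text: the average of 1 - K, where K is the
# fraction of the incident energy retained by the leading hadron.
# The K values below are made-up illustrative samples, not data.
k_samples = [0.6, 0.45, 0.7, 0.5, 0.55]
inelasticity = sum(1.0 - k for k in k_samples) / len(k_samples)
print(round(inelasticity, 2))  # -> 0.44
```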

M de Petris (Rome) described the MITO infrared telescope programme, based on a 2.6 m ground-based telescope at 3500 m on Testa Grigia, Italy, for infrared astronomy over the 0.6-30 mm wavelength range. One stated goal is “the multifrequency observation of rich large clusters of galaxies to estimate, together with X-ray information, the Hubble constant”. As these wavelengths are strongly absorbed by atmospheric water vapour, which falls off more rapidly than barometric pressure with altitude, such observations profit particularly from high-altitude locations, and the advantages of locating such a facility at Chacaltaya were pointed out.

Nuclear content

[Figure: cerncosmic3_1-01]

The SYS group also reported that the (logarithmic) average nuclear composition of the primary cosmic rays at an energy of 10 PeV was about that of oxygen. This is somewhat lower than the BASJE data had suggested; BASJE (and others) argue that the composition at this energy is heavier, with iron nuclei dominant. On the other hand, studies of the atmospheric Cherenkov radiation accompanying air showers have suggested a lighter nuclear spectrum at this energy.

J Stamenov (Bulgarian Academy of Sciences) and J Procureur (Bordeaux) presented a proposal, for a possible future extended air shower array at Chacaltaya, to select showers generated by primaries with different masses but the same energy.

Other activities on Mount Chacaltaya that were discussed included a Search for Light Magnetic Monopoles (SLIM) with a 100 m^2 passive nuclear track detector (expandable to 400 m^2 in the future) consisting of three sheets of CR-39 track-etch detector, three Makrofol sheets and an aluminium absorber, as reported by S Cecchini (Bologna). SLIM could also be sensitive to “strangelets”, as discussed by G Wilk (Warsaw). A strangelet (strange quark matter or “nuclearite”) would be a nuclear object containing approximately equal numbers of up, down and strange quarks.

Searches for high-energy gamma-ray point sources with the SYS array were presented by R Bustos (La Paz) and searches for gamma-ray “bursters” with the INCA experiment (Investigation on Cosmic Anomalies) by S Vernetto (Turin). Results of both searches were negative. However, Vernetto showed that the INCA experiment at Chacaltaya has provided the lowest upper limits on gamma-ray bursters from a GeV-TeV ground-based experiment. Also reported were high-altitude studies of background ionizing radiation at Chacaltaya by S Cecchini, and a neutron monitor installation by E Cordaro (Santiago).

Bending the knee

[Figure: cerncosmic4_1-01]

The cosmic ray spectrum changes behaviour at a “knee” (between 1 and 10 PeV). At these energies the flux is too low for direct observation using small balloon- or satellite-borne detectors, so earth-based observations must interpret indirect observables – air-shower (electron and gamma ray) components, muons and hadrons – in terms of the primary interaction and the mass composition.
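The change of behaviour at the knee is usually described as a broken power law. A sketch under stated assumptions: the spectral indices 2.7 and 3.1 and the 4 PeV knee position are commonly quoted values, not figures from this article, which only places the knee between 1 and 10 PeV:

```python
# Broken power-law sketch of the differential cosmic-ray spectrum.
# Knee position and both spectral indices are assumed, commonly quoted values.
E_KNEE = 4.0e15  # assumed knee energy in eV

def flux(energy_ev, norm=1.0):
    """Differential flux in arbitrary units, continuous at the knee."""
    if energy_ev < E_KNEE:
        return norm * (energy_ev / E_KNEE) ** -2.7
    return norm * (energy_ev / E_KNEE) ** -3.1

# One decade above the knee the flux has dropped by 10**-3.1 (about 1/1260),
# compared with 10**-2.7 (about 1/500) per decade below it.
print(flux(10 * E_KNEE) / flux(E_KNEE))
```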

The simulations for the primary interaction are based on long extrapolations from sub-TeV accelerator data and are hence uncertain. The primary cosmic rays range from protons to iron nuclei. The problem is that the observable consequences of the composition and characteristics of the primary interaction are interrelated.

The KASCADE array at Karlsruhe is probably the most densely instrumented air shower array in operation. A Haungs (Karlsruhe) reported on recent work. Analysis of the hadron and electromagnetic components shows that the spectrum of light primaries shows a clear break (the knee), while the heavy primary spectrum is relatively smooth. This results in an increase in average nuclear mass with energy through the knee region. The Karlsruhe group has found no existing Monte Carlo model to be totally satisfactory at energies of 10 PeV and above and is tuning a promising new candidate.

An analogue of the KASCADE array at the elevation of Chacaltaya was presented by O Saavedra (Turin). A central hadron calorimeter/muon detector of perhaps 100 m^2 with finely instrumented upper layers would be surrounded by a dense air shower array, including additional muon detectors. Although an ambitious project, this could go a long way towards resolving the confusion and contradictions surrounding the composition, the primary spectrum and the physics around the knee of the cosmic ray spectrum.

The Japanese and the Russians have exploited an emulsion chamber array in the Pamirs. Aspects of this research were presented by M Tamada (Kinki, Osaka), T Yuldashbaev (Tashkent), and S Slavatinsky and A Borisov (Lebedev Institute, Moscow). They compared their data with a quark-gluon string model, concluding that inelasticity increases with energy, contrary to the SYS conclusions.

Slavatinsky also emphasized unusual phenomena seen in their emulsion chambers – “aligned events”, “halo events” and the “long-flying component”. The latter was interpreted by Z Wlodarczyk (Kielce) as possible evidence for strangelets, which, he noted, might also be the source of the mysterious Centauros. J N Capdevielle (Paris) and S Nikolsky (Moscow) presented possible evidence for quark-gluon plasma in emulsion chambers.

Can carbon nanotubes handle high-energy particles?

[Figure: cernnano1_1-01]

Nanoparticles are small pieces of matter that, at least in one dimension, consist of tens to thousands of atoms and have widths of the order of a few nanometres. Nanotechnology – the production, study and application of nanoparticles – could be set to play a major role in technological development.

Carbon nanotubes are typical nanoparticles. They were first produced in 1991, following the 1985 discovery of the large football-like “fullerene” molecules, each comprising 60 carbon atoms, which are found in the sediment from the laser irradiation of graphite.

Nanotubes can be imagined as rolled-up graphite crystallographic planes, with carbon atoms separated by 0.14 nm at the vertices of honeycomb hexagons. Nanotubes are either concentric multiwalled (MWNT) or single-walled (SWNT) structures. The latter are characterized by two numbers, n and m, which determine not only the diameter and geometry but also many physical parameters, such as whether the tube is metallic or semiconducting.

In 1996 it was shown that metallic SWNTs with n = m = 10 are produced with 75% efficiency in the sediment of graphite after laser irradiation. SWNTs have a length of up to 200 µm and a diameter of 1.38 nm, and a few hundred of them form compact ropes with 1.7 nm between the axes.
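The link between the indices (n, m) and the diameter is a standard geometric formula (not spelled out in the article): d = √3·a·√(n² + nm + m²)/π, where a is the carbon-carbon bond length. A quick check against the quoted (10,10) diameter:

```python
import math

# Diameter of an (n, m) nanotube from the standard rolled-graphene formula.
# The formula and the C-C bond length are textbook values, not from the article.
A_CC = 0.142  # carbon-carbon bond length in nanometres

def diameter_nm(n, m):
    return math.sqrt(3) * A_CC / math.pi * math.sqrt(n * n + n * m + m * m)

print(round(diameter_nm(10, 10), 2))  # -> 1.36 nm, close to the quoted 1.38 nm
```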

The unique properties of MWNTs and SWNTs promise wide application in various fields of industry and science. Owing to their smaller size, nanotube chips could replace transistors in electronic devices, providing higher densities of logic units. With their more effective field emission of electrons, they could be better than liquid crystals in advertising, electronic and television displays.

Channelling

In addition to other possible developments, during 1996-1997 V V Klimov, V S Letokhov, L A Gevorgian and the present authors considered the theory of the channelling of high-energy particles in SWNTs. (In conventional channelling, charged particles are steered by the electromagnetic forces in crystals.)

[Figure: cernnano2_1-01]

The fact that the diameter of SWNTs is larger than the distances between the crystallographic planes, and the possibility that, in the near future, nanotubes will be available in lengths greater than single crystals, underline the potential advantage of nanotubes.

Indeed, the much lower SWNT electron density results in a sharp decrease in multiple scattering. As a result, the channelling protons and positrons moving near the SWNT axis suffer less dechannelling.

The classical and quantum theories of the radiation of particles channelled in SWNTs, valid respectively for energies above and below about 100 MeV, demonstrate that, taking medium polarization into account, X-ray production has specific threshold and spectral properties, and that it can serve as a source of intense quasimonochromatic photon beams. Periodically deformed nanotubes, acting as microundulators, could provide intense linearly and circularly polarized spontaneous and stimulated radiation in the X-ray region.

The final curtain falls on LEP

[Figure: cernnews3_1-01]
[Figure: cernnews4_1-01]

After a concerted push by physicists to extend the running of CERN’s LEP 27 km electron-positron collider into 2001, the decision has been taken to close the machine for good.

The original masterplan foresaw closure in September after 11 years of running, but unprecedented collision energies above 200 GeV enabled several of the experiments to glimpse signs of the long-awaited Higgs particle, which endows all other particles with mass. LEP was thus given a six-week “stay of Higgs execution”.

Extra evidence seen during the extension shows the tentative LEP Higgs signal to have a mass of around 115 GeV. These candidate events are dominated by the production in LEP’s electron-positron collisions of a Higgs particle and a Z boson, although evidence for other Higgs production mechanisms is also seen. However, the combined effect falls slightly short of what is required to claim an outright Higgs discovery.

Physicists continued to push for additional LEP running, but on 17 November CERN’s governing body, the Council, gave its verdict. Council expressed its “recognition and gratitude for the outstanding work done by the LEP accelerator and experimental teams. It has taken note of the request by many members of the CERN scientific community to continue LEP running into 2001 and also noted the divided views expressed in the scientific committees consulted on this subject. On the basis of these considerations and in the absence of a consensus to change the existing programme, Council supports the director-general in pursuing the existing CERN programme.”

The “existing programme” meant the plan to close LEP in 2000 and focus resources on the LHC proton collider, to be installed in the LEP tunnel and scheduled to start running in 2005.

While the Higgs evidence was compelling, the mechanisms involved were also at the extreme end of LEP’s energy reach, so the physicists could only touch the Higgs candidates with their fingertips. There were doubts that additional running would substantially consolidate the signal. This, coupled with the need to keep LHC construction on schedule, led to the final controversial decision.

It is rare that major particle accelerator machines close at CERN. The usual pattern is that new machines stand on the shoulders of their predecessors. The electrons and positrons for LEP came via a chain of more mature machines, including the 28 GeV PS synchrotron, which, when it first came into operation in 1959, was briefly the world’s highest-energy accelerator, and the 450 GeV SPS synchrotron, commissioned in 1976.

One CERN machine that closed was the laboratory’s first accelerator, the 600 MeV synchrocyclotron (SC), commissioned in 1957 and turned off in 1990. The SC was a stand-alone accelerator and did not serve as an injector for any later machine, but it did spawn the ISOLDE on-line isotope separator, subsequently transferred to the PS Booster.

Another past CERN machine was the Intersecting Storage Rings – the world’s first proton collider. It was commissioned in 1971 but was switched off early in 1984 to release resources for LEP construction.

The LEAR low-energy antiproton ring, commissioned in 1983, was terminated in 1996 to free resources for the LHC.

The 1983 groundbreaking ceremony for LEP was a major milestone in CERN’s history. LEP was the initial reason for the 27 km tunnel excavated under the Swiss-French frontier, but it was understood almost from the outset of LEP preparations in the mid-1970s that the tunnel would be a valuable piece of physics research real estate that would one day house a more powerful machine – LHC.

In a final proud gesture before the curtain came down on its part in the play, LEP, operating at high energies unforeseen until late in its career, revealed its intriguing hints of the long-awaited Higgs particle. This physics now has to await confirmation and consolidation at Fermilab’s Tevatron proton-antiproton collider and/or LHC.

The step from LEP to LHC is a natural progression. LEP’s disappearance is not an abrupt closure of a thriving machine – it gave all that was expected of it, and more.

LHC will take its particles from the Booster-PS-SPS chain of synchrotrons. As well as this physical supporting infrastructure, LHC will stand on metaphorical shoulders – the extinct ISR – for it was here that CERN first acquired collider expertise; the additional skills acquired at the SPS, which operated as the world’s first proton antiproton collider (1981-1990); and, of course, LEP.

CMS calorimeter begins to take shape

[Figure: cernnews5_1-01]

Research institutes all over the world are busy providing components for the experiments at CERN’s Large Hadron Collider.

The hadronic calorimeter for the Compact Muon Solenoid (CMS) experiment reached an important milestone on 27 October 2000 when the first half of its barrel structure was test assembled at the Felguera Construcciones Mecánica SA plant in the Asturias region of Spain. The structure has since been dismantled and transported to CERN, where the sensitive elements – scintillator tiles with optical-fibre read-out – will be installed over the coming year.

CMS has adopted a conventional scintillator/absorber sandwich architecture for its hadronic calorimeter. However, because the device will be installed inside the experiment’s powerful solenoid magnet, it is subject to unconventional constraints. The absorber material – brass – has been chosen because it is non-magnetic, cheap, easy to machine and sufficiently dense for the job. The brass has been supplied by companies in both Bulgaria and the UK.

Read-out is via optical fibres coupled to hybrid photodiodes that are capable of functioning within the CMS magnet. Design and construction of the CMS barrel hadronic calorimeter are the responsibility of Fermilab in the US, which awarded the contract for building the structure to Felguera.

Full assembly of the first half of the barrel structure allowed the experiment to verify that the exacting tolerances that are required between the 25 tonne wedges to eliminate gaps had been achieved. This ensures that the calorimeter has full azimuthal coverage without any cracks.

The next step is the installation of the scintillator elements at CERN. Production of these elements is more than 50% complete, and delivery from Fermilab to CERN is well advanced. Installation of the scintillator into the first half barrel should be completed by the autumn, with the second half following by the end of 2002. The integration and testing of the detector will take place in 2003.

Pulling the trigger on LHC electronics

A decade ago, many aspects of the preparations for the major physics experiments for CERN’s Large Hadron Collider (LHC; scheduled to come into operation in 2005) appeared to be problematic. Particularly so were the front-end data acquisition electronics, which needed to be resistant to continual bombardment by high-energy collision products, and the trigger, which has to sift through the collisions for interesting results and reduce the original collision rate by a factor of about 10 million.
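The overall factor of about 10 million is typically achieved in cascaded trigger stages. A minimal sketch; the individual stage rates below are assumptions for illustration, not figures from the article:

```python
# Cascaded trigger rejection, sketched with assumed stage rates.
input_rate = 1.0e9     # assumed LHC interaction rate, Hz
level1_rate = 1.0e5    # assumed first-level (hardware) trigger output, Hz
storage_rate = 1.0e2   # assumed rate finally written to storage, Hz

level1_rejection = input_rate / level1_rate   # 10^4 from the first level
hlt_rejection = level1_rate / storage_rate    # 10^3 from later software stages
overall = level1_rejection * hlt_rejection
print(overall)  # -> 10000000.0, the factor of about 10 million quoted
```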

Fully aware that these preparations had to anticipate and take optimum advantage of rapid developments in modern microelectronics, a systematic research and development programme led by CERN’s Detector Research and Development Committee started to address these challenges and pave the road towards the LHC experiment proposals. However, the real work started once the projects had been approved. Progress has been marked by successive workshops of the Electronics Board of the LHC Experiments Committee. The most recent of these – the sixth in the series – was held in Cracow, Poland, in September 2000. The previous workshop was held in Snowmass, Colorado, and the workshop for 2001 is scheduled to take place in Stockholm, Sweden.

New challenges

[Figure: cerntrigger1_1-01]

At the time of the first of such workshops (1995-1997), designers were concentrating on individual designs, studying the basics of radiation effects and learning how to work with industry. Custom integrated circuit technology offered them the possibility of placing data buffering and first-level trigger filtering functions directly onto the detectors.

However, the process of developing mature integrated circuits turned out to be more time-consuming than had been anticipated. Testing was a bottleneck (one full design cycle takes about a year to complete), and using the special industrial technologies required to resist the LHC radiation environment involved a number of unexpected complications.

Meanwhile, the LHC community had realized that the experimental caverns would also present risks for the electronics, which needed to be, at the very least, “radiation tolerant”. Making sure that the many commercial off-the-shelf components envisaged for the caverns were sufficiently radiation tolerant was a complex and difficult task.

All of these challenges led the electronics development teams to search for new solutions, working hand-in-hand with other research labs and with industry. Today we have several designs ready, or almost ready, for production. A good example is the ABCD3T silicon tracker front-end chip for the giant ATLAS detector (figure 1).

This implements a binary read-out architecture in a 0.8 µm BiCMOS silicon-on-insulator technology, specially developed to meet the challenges of the LHC environment. Its equivalent for the big Compact Muon Solenoid (CMS) experiment – APV25 – reads out the data in analogue form and is implemented in a commercial 0.25 µm CMOS technology using a radiation-tolerant design technique developed for LHC. Both function according to the target design specifications.

Viable solutions

In addition, complex radiation-hard chips for the read-out of pixel detectors of all LHC experiments have now started to appear, a good example being the pixel read-out chip developed for the ALICE and LHCb experiments. This contains more than 13 million transistors and can be configured for tracking applications in ALICE or for particle identification in LHCb’s hybrid Ring Imaging Cerenkov detectors.

Prototype read-out boards for calorimeters and muon chambers have been shown to meet the functionality and performance specifications needed. However, not all of the components are sufficiently resistant to radiation yet, and further work is needed to optimize this aspect of the designs before launching production.

New problems, known as “single event upsets”, have appeared as side-effects of the evolution of microelectronics technologies towards smaller feature sizes. The smaller charges stored in these circuits are more easily perturbed by ionizing particles, which often requires modifications to the design (e.g. selecting a more robust component, or using error detection/correction techniques).
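As an illustration of the kind of error detection/correction technique mentioned above, the sketch below implements a textbook Hamming(7,4) code, which can correct any single flipped bit in a stored word. This is a generic example of the principle, not the actual circuitry used in the LHC electronics.

```python
# Illustrative sketch: Hamming(7,4) single-error correction, the class of
# technique used to guard stored bits against single event upsets (SEUs).
# Hypothetical example code, not taken from any LHC design.

def hamming74_encode(nibble):
    """Encode 4 data bits (int 0-15) into a 7-bit codeword."""
    d = [(nibble >> i) & 1 for i in range(4)]          # data bits d0..d3
    p1 = d[0] ^ d[1] ^ d[3]                            # parity over positions 1,3,5,7
    p2 = d[0] ^ d[2] ^ d[3]                            # parity over positions 2,3,6,7
    p3 = d[1] ^ d[2] ^ d[3]                            # parity over positions 4,5,6,7
    bits = [p1, p2, d[0], p3, d[1], d[2], d[3]]        # codeword positions 1..7
    return sum(b << i for i, b in enumerate(bits))

def hamming74_decode(code):
    """Correct up to one flipped bit and return the original 4 data bits."""
    bits = [(code >> i) & 1 for i in range(7)]
    s1 = bits[0] ^ bits[2] ^ bits[4] ^ bits[6]
    s2 = bits[1] ^ bits[2] ^ bits[5] ^ bits[6]
    s3 = bits[3] ^ bits[4] ^ bits[5] ^ bits[6]
    syndrome = s1 | (s2 << 1) | (s3 << 2)              # 1-based position of the error
    if syndrome:
        bits[syndrome - 1] ^= 1                        # flip the corrupted bit back
    return bits[2] | (bits[4] << 1) | (bits[5] << 2) | (bits[6] << 3)
```

A single upset anywhere in the 7-bit word is corrected transparently; real designs trade such redundancy against the area and power budgets of the front-end chips.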

For the optical read-out links, 1310 nm edge-emitting lasers have been selected for the analogue read-out of the CMS tracker, while 850 nm vertical cavity surface-emitting lasers appear to be a good choice for digital read-out links. Altogether it appears that viable solutions for the front-end detector electronics and optical read-out links have been found.

cerntrigger2_1-01

However, it will not be easy to integrate all of these components into the compact LHC detectors. Recent LHC Electronics Board (LEB) workshops tried to underline the system aspects of the LHC electronic designs: power supplies and distribution; grounding and shielding; cooling; timing and synchronization; and controls. The development groups are gradually attacking these issues and presenting solutions.

Figure 2 shows a small test set-up using prototypes of almost all of the elements foreseen for the read-out of the CMS tracker. To study issues such as timing and synchronization, it is being operated in a CERN test beam with a 25 ns structure, mimicking the 40 MHz LHC bunch-crossing frequency.

Nevertheless, large prototype system tests still have to be built and operated to prove that everything is understood and under control. This will require the mastery of complex test and assembly processes using state-of-the-art technologies, for which available staffing is not always sufficient.

cerntrigger3_1-01

Close partnerships with industry and the adoption of common solutions (for crates, power supplies, cables, controls, etc.) wherever possible will help to alleviate this problem. Other difficult issues include the maintenance and obsolescence of technologies over the relatively long timescale of the LHC project – already the very rapid pace of microelectronics advances has forced redesigns of several developments using new technology.

Triggering and data acquisition present one of the extraordinary challenges facing detector designers at the high-luminosity LHC (figure 3). The LHCb trigger and data acquisition system must be able to handle trigger rates approaching 1 MHz, while the ALICE experiment operating in ion-ion collision mode must be able to handle large event sizes.

In the case of ATLAS and CMS, when LHC operates in proton collision mode at its nominal design luminosity of 10³⁴ cm⁻²s⁻¹, an average of 25 events are expected to occur at each bunch crossing, while bunch crossings will occur at a rate of 40 MHz. This input rate of 10⁹ interactions every second must be reduced by a factor of at least 10⁷ to about 100 Hz – the maximum rate that can be archived by the on-line computer farm.
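The rate figures quoted above fit together as a simple back-of-envelope calculation, sketched here using only the numbers given in the text:

```python
# Back-of-envelope check of the ATLAS/CMS trigger rate figures quoted in
# the text (nominal LHC design numbers, not an official CERN calculation).

bunch_crossing_rate = 40e6   # Hz: one bunch crossing every 25 ns
events_per_crossing = 25     # average pile-up events at design luminosity
archive_rate = 100.0         # Hz: maximum rate the on-line farm can record

# Raw interaction rate seen by the detector front-ends:
interaction_rate = bunch_crossing_rate * events_per_crossing   # 1e9 per second

# Rejection the trigger chain must deliver to reach the archive rate:
reduction_factor = interaction_rate / archive_rate             # 1e7
```

This factor-of-ten-million rejection is exactly the trigger requirement stated at the start of the article.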

LEP reaps a final harvest

cernnews1_12-00

CERN’s LEP electron-positron collider stubbornly refused to lie down quietly in 2000. The world’s largest synchrotron storage ring was scheduled to be closed forever at the end of September, and dismantled to make way for the LHC proton collider to be built in the same 27 km tunnel.

However, with tantalizing glimpses of the long-awaited Higgs particle appearing at the last gasp, LEP was accorded a six-week stay of “Higgs execution”. The machine duly finished its 2000 run on 2 November. In a specially convened meeting of the LEP Experiments Committee on 3 November, LEP physicists revealed the fruit of these extra few weeks of autumn running.

The Higgs particle, which breaks electroweak symmetry and endows particles with mass, is the missing link in the Standard Model of particle physics, and a major objective at LEP. As LEP’s energy was increased over the years, more and more Higgs territory has been covered without finding any signs of the elusive particle – until this year.

To boost the energy of LEP’s particles, from 1996 the machine was equipped with superconducting radiofrequency accelerating cavities. The remarkable success of this scheme, together with astute planning and skilled machine operation, enabled LEP to reach collision energies of up to 209 GeV, beyond its planned energy horizon.

cernnews2_12-00

For the past several years, LEP has been running in exactly the energy band where the Higgs had been most expected. Each time the energy was increased, physicists held their breath. As data started to accumulate above 206 GeV late this summer, a few electron-positron events suggested Higgs production with a mass of around 114-115 GeV.

In these events, a LEP electron-positron collision could produce a Higgs back-to-back with a Z particle. However, the Higgs signals are right at the extreme edge of LEP’s kinematic reach, and are difficult to disentangle from more common processes, notably the production of Z and W particle pairs.

The particles can decay in a number of ways. The initial candidates showed four confined sprays (“jets”) of particles, two of them from the Higgs. However, other decay patterns are possible, and the recent run has also revealed events combining jets with “missing mass”, indicating the production of two otherwise invisible neutrinos, as well as other signatures.

In the combined results of the four LEP experiments – ALEPH, DELPHI, L3 and OPAL – confidence in the candidate Higgs signal therefore slightly increased as a result of the autumn run, but still fell short of the level needed to claim a physics discovery. The experiments therefore requested a further extension of LEP running in 2001.

However, with the LHC knocking loudly on the door, this has been ruled out. LEP has run for the last time, and its ultimate findings point the way to future physics at the LHC.

At the 3 November meeting where the latest LEP results were disclosed, there was an ovation for the LEP operations team, which had delivered the high-energy goods and provided such a cliffhanger finish to the machine’s 11-year career (see LEPilogue: marking the end of an era).
