
The LHC’s new frontier

On 30 March, just one month after CERN’s Large Hadron Collider (LHC) had restarted for 2010, control rooms around the 27 km ring echoed with cheers as the machine produced the first collisions at a record energy of 7 TeV in the centre of mass. Over the following days, the LHC experiments started to amass millions of events during long periods of running with stable beams, thus beginning an extended journey of exploration at a new energy frontier.

Golden orbit

The first taste of beam for 2010, on 28 February, was at 450 GeV, the injection energy from the SPS (CERN Courier April 2010 p6). Operating the LHC at this energy soon became routine, allowing the teams to perform the tests necessary to optimize the beam orbit and the collimation, as well as the injection and extraction procedures. This work resulted in the definition of the parameters for collimation and machine protection devices for a “golden” reference orbit, with excellent reproducibility. It showed that the collimation system works as designed, with beam “cleaning” and other losses exactly where expected at the primary collimators. The tests also involved systematic and thorough testing of the beam dumping system, which proved to work well. One mystery about the beam still remains: the “hump”, a broad frequency-driven beam excitation that leads to an increase in the vertical beam size. Nevertheless, the teams measured good beam lifetimes, and in just under two weeks, on 12 March, the operators were able to ramp the beams up to 1.18 TeV, the highest energy achieved in 2009 (CERN Courier January/February p24).

A short technical stop followed, during which the magnet and magnet protection experts continued their campaign to commission the machine to 6 kA – the current needed in the main magnets to operate at 3.5 TeV per beam. A key feature is the quench protection system (QPS): on detecting the first indication that part of a superconducting magnet coil is turning normally conducting – quenching – it forces the whole coil to become normally conducting, thereby distributing the energy of the magnet current over its whole length. In the induced quench, the huge amount of energy stored in the coil is safely extracted and “dumped” into specially designed resistors. At the same time the QPS triggers a mechanism to dump beam within three turns.
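The scale of the energy that the QPS must dump can be illustrated with a back-of-the-envelope estimate using E = ½LI². The inductance below is an assumed round number chosen for illustration, not an official LHC parameter:

```python
# Illustrative estimate of the magnetic energy stored in a dipole circuit,
# E = 1/2 * L * I^2. The inductance L_dipole is an assumed round number
# for illustration, not a quoted LHC figure.
L_dipole = 0.1   # henries (assumed)
I = 6000.0       # amperes: the 6 kA needed for 3.5 TeV per beam

E = 0.5 * L_dipole * I**2   # stored energy in joules
print(f"Stored energy per dipole circuit: {E/1e6:.1f} MJ")
```

Even under these rough assumptions the result is of order a megajoule per magnet, which is why the energy must be spread over the whole coil and extracted into dump resistors rather than deposited at the quench point.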

In 2009, the system was fully commissioned to 2 kA, the current necessary to reach an energy of 1.18 TeV. However, during the final stages of hardware commissioning in February, multiple induced quenches sometimes occurred during powering off. It turns out that the system can be “over-protective”, because transient signals unrelated to real quenches can trigger controlled quenches. Once the problem was understood, the machine protection experts decided that they could solve it by changing thresholds in the magnet circuits equipped with the new QPS. For those parts with the old QPS, however, the solution required a modification to cards in the tunnel (to delay one of the transients). While awaiting full tests before implementing these changes (later in April), the experts took the decision to go ahead and run the main bending magnets up to 6 kA, but to limit the ramp rate to 2 A/s to reduce the transients.

By midday on 18 March the operators had the green light to try ramping to 6 kA at the agreed slow rate, first testing this ramp rate to 2 kA (1.18 TeV). By 10.00 p.m., after one or two interruptions, they had succeeded with a “dry ramp”, without beams. Work on beam injection and orbit corrections followed before a ramp started at around 4.00 a.m. with a low-intensity probe beam – about 5 × 10⁹ protons in a single bunch per beam. Gradually, the current in the main bending magnets rose from 460 to 5850 A and at about 5.23 a.m. the beams reached 3.5 TeV – a new world record at the first attempt. Already, measurements suggested a lifetime for both beams of as long as 100 hours.

Over the following days, machine studies at 3.5 TeV continued, with ramping becoming routine and the orbit stable and reproducible. Just as at 450 GeV, machine protection and collimation studies were important before the step to collisions at 3.5 TeV could take place. Only then would the operators be able to declare “stable beam” conditions so that the experiments could turn on the most sensitive parts of their detectors to observe events at the new high-energy frontier.

With some critical work still remaining, on 23 March the management took the decision to announce that the first attempt at collisions would take place a week later, on 30 March, with invited media in full attendance. The following days were not without difficulties, as a variety of hardware problems occurred, and each morning saw a change of plans in the run-up to the first collisions at 3.5 TeV per beam. Further planned running and higher-intensity studies at 450 GeV were among the casualties. By 29 March, however, the operators had performed all of the essential tests for declaring “stable beams” at 3.5 TeV and were able to run the machine for several hours at a time, with a non-colliding bunch pattern to avoid premature collisions in any of the experiments.

Finally at 4.00 a.m. on 30 March the LHC team was ready to inject beam in a colliding bunch pattern with two bunches per beam, in preparation for collisions. After the necessary checks, they began the ramp to 3.5 TeV at 6.00 a.m. just as the first media were arriving on CERN’s Meyrin site. Twice, part of the machine tripped during the ramp and twice the operators had to ramp back down and re-establish beam at 450 GeV. The third attempt, however, from 11.52 a.m. to 12.38 p.m., was successful. Then, after some final measurements on the beam, it was time to remove the “separation bumps” – the fields in corrector magnets that are used to keep the beams separated at the interaction points during the ramp.

At 12.52 p.m. the operators announced that they were happy with the beam orbit and were about to remove the separation bumps. At 12.57 p.m. online beam and radiation monitors indicated that the CMS experiment had collisions, confirmed almost immediately by the online event displays. At 12.58 p.m. the ATLAS collaboration saw the experiment’s first events at a total energy of 7 TeV burst onto the screens of the crowded control room. At 12.59 p.m. the LHCb experiment saw its first collisions and by 1.01 p.m. the ALICE website was announcing its first 7 TeV events. At the same time, the two smaller LHC experiments also reported collisions. The TOTEM experiment saw tracks in one of its particle telescopes, while the LHCf calorimeters recorded particle showers with more than 1 TeV of energy. CERN’s press office swiftly told the assembled media and reported the successful observation of collisions at 7 TeV total energy to the world: the LHC research programme had finally begun.

At 1.22 p.m. the operators declared “stable beams” and the LHC provided three and a half hours of collisions before an error caused the beams to dump safely. During this time, CMS, for example, collected around 600,000 collision events and LHCf detected as many as 30,000 high-energy showers.

The following week saw several prolonged periods of “quiet” running during which the experiments continued to accumulate events. These were interspersed with further tests and machine development work. There were also scheduled periods for access to the tunnel, for example to begin work on the QPS to allow a faster ramp rate of 10 A/s. There was also the almost inevitable “down time” that arises with any complex machine.

The challenge ahead for the LHC team is to increase the luminosity, which is a measure of the collision rate in the experiments. The design luminosity is 10³⁴ cm⁻² s⁻¹, but in these early days the experiments are seeing around 10²⁷ cm⁻² s⁻¹. It is a case of learning to walk in small steps before running flat out, especially considering the total energy of the beams at higher luminosities. This is why the first investigations are always performed with the low-intensity “probe” beam.
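The luminosity L relates to the event rate through R = Lσ, where σ is the cross-section of the process in question. A minimal sketch of this arithmetic, using an assumed inelastic cross-section of roughly 70 mb purely for illustration:

```python
# Event rate R = L * sigma.
# The inelastic cross-section used here is an assumed rough value
# (~70 mb at 7 TeV), for illustration only.
sigma_inel = 70e-3 * 1e-24    # 70 mb converted to cm^2 (1 barn = 1e-24 cm^2)

L_design = 1e34               # design luminosity, cm^-2 s^-1
L_early = 1e27                # early-running luminosity, cm^-2 s^-1

rate_design = L_design * sigma_inel   # collisions per second at design
rate_early = L_early * sigma_inel     # collisions per second in early running

print(f"design: {rate_design:.1e} /s, early: {rate_early:.0f} /s")
```

Under these assumptions the design luminosity corresponds to several hundred million inelastic collisions per second, against a few tens per second in early running – seven orders of magnitude that explain why intensity is raised so cautiously.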

The luminosity depends not only on how many particles are in the beams, but also on making sure that the beams collide head-on exactly at the interaction points. Ensuring that this happens is the goal of dedicated “luminosity scans” in horizontal and vertical beam position for the experiments at each of the four interaction points. In addition, the LHC operators can reduce the beam size at the collision points by “squeezing” the betatron function that describes the amplitude of the betatron oscillations about the nominal orbit. On 1 April the first squeeze from 11 m down to 2 m was successfully performed in several steps at Points 1 and 5, where ATLAS and CMS are located (together with LHCf and TOTEM, respectively).
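Since the transverse beam size at the interaction point scales as the square root of the betatron function there, σ = √(εβ*), the squeeze from 11 m to 2 m shrinks the beams – and raises the luminosity – by a calculable factor. A sketch of the scaling, with an assumed geometric emittance used only for illustration:

```python
import math

# Beam size at the interaction point: sigma = sqrt(emittance * beta_star).
# The geometric emittance below is an assumed illustrative value,
# not a quoted LHC parameter.
emittance = 5e-10     # metres (assumed geometric emittance)

def beam_size(beta_star):
    """RMS transverse beam size in metres for a given beta* in metres."""
    return math.sqrt(emittance * beta_star)

sigma_before = beam_size(11.0)   # beta* = 11 m, before the squeeze
sigma_after = beam_size(2.0)     # beta* = 2 m, after the squeeze

# Luminosity scales as 1/beta* (squeezing both planes equally),
# so the gain is simply the ratio of the two beta* values: 11/2 = 5.5.
gain = sigma_before**2 / sigma_after**2
print(f"Beam shrinks by {sigma_before/sigma_after:.2f}x; "
      f"luminosity gain {gain:.1f}x")
```

Whatever the actual emittance, the ratio is independent of it: the squeeze alone buys a factor of 5.5 in luminosity.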

By the end of the first week of April, each of the four large experiments had accumulated some 300 μb⁻¹ of data, corresponding to several million inelastic events. When optimized and with about 1.1 × 10¹⁰ protons per bunch, they were recording data at a rate of up to around 120 Hz and measuring a luminosity lifetime well in excess of 20 hours. The first stage of the journey to attain 1 fb⁻¹ before a long shutdown towards the end of 2011 had begun.
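The quoted integrated luminosity can be converted into an approximate number of collisions via N = ∫L dt × σ. A sketch, again with an assumed inelastic cross-section of about 70 mb (the experiments record only a fraction of the collisions produced):

```python
# Number of collisions N = integrated luminosity * cross-section.
# The cross-section is an assumed rough value (70 mb), used only to
# illustrate the order of magnitude; it is not a measurement.
lumi_int_ub = 300.0        # integrated luminosity, in inverse microbarns
sigma_inel_ub = 70e3       # 70 mb expressed in microbarns (assumed)

n_collisions = lumi_int_ub * sigma_inel_ub
print(f"~{n_collisions:.1e} inelastic collisions produced")
```

This gives of order 10⁷ collisions produced, consistent with the several million events actually written to tape after trigger selection.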

Gell-Mann: quantum mechanics to complexity


To celebrate Murray Gell-Mann’s many contributions in physics in his 80th year, the Institute of Advanced Studies at Nanyang Technological University and the Santa Fe Institute jointly organized the Conference in Honour of Murray Gell-Mann, which took place in Singapore on 24–26 February. Aptly entitled “Quantum Mechanics, Elementary Particles, Quantum Cosmology and Complexity” to focus on Gell-Mann’s achievements in these fields, the three-day conference was a festival of lectures and discussions that attracted more than 150 participants from 22 countries. Those in attendance included many of Gell-Mann’s former students and collaborators. For a select few this was their second visit to Singapore, having attended the 25th Rochester Conference held there 20 years ago.

The meeting began with a brief scientific biography of Gell-Mann presented by his close collaborator Harald Fritzsch of Ludwig-Maximilians University, who highlighted his main achievements. During the 1950s Gell-Mann worked with Francis Low on the renormalization group and with Richard Feynman on the V-A theory of the weak interaction. The application of the SU(3) symmetry group to classify hadrons led Gell-Mann to predict the existence of the Ω particle in 1962; its subsequent discovery in 1964 paved the way to his receiving the Nobel Prize in Physics in 1969. Gell-Mann and George Zweig independently proposed quarks as the constituents of hadrons in 1964.

Gell-Mann studied the current algebra of hadrons together with various co-workers. In 1971 he introduced light-cone algebra together with Fritzsch, as well as the colour quantum number for quarks. A year later they proposed the theory of QCD for the strong interaction. In 1978 Gell-Mann, Pierre Ramond and Richard Slansky proposed the seesaw mechanism to explain the tiny neutrino masses. Then, in around 1980, Gell-Mann switched his interest towards the foundations of quantum mechanics, quantum cosmology and string theory.

Multifaceted

Gell-Mann’s interests extend beyond physics – he loves words, history and nature. He has moved between disciplines that include historical linguistics, archaeology, natural history and the psychology of creative thinking, as well as other subjects connected with biological and cultural evolution and with learning. He currently spearheads the Evolution of Human Languages Program at the Santa Fe Institute, which he co-founded.

The subsequent talks by Nicholas Samios of Brookhaven National Laboratory and George Zweig of Massachusetts Institute of Technology (MIT) were very entertaining. They touched on the historical background that led to the discovery of the Ω – predicted by Gell-Mann’s Eightfold Way – and to the quark model of hadrons, and were accompanied by interesting anecdotes and photographs. Zweig related the origin of the terminology “quark” and how the battle between “aces” and quarks unfolded.

There were several talks on recent advances in various theoretical and experimental aspects of QCD as well as on the Higgs boson. CERN’s John Ellis discussed the Higgs particle and prospects for new physics at the LHC. Nobel laureate C N Yang of Tsinghua University gave a talk on his recent work on the ground-state energy of a large one-dimensional spin-1/2 fermion system in a harmonic trap with a repulsive delta-function interaction, based on the Thomas-Fermi method. Gerard ‘t Hooft of Utrecht University – another Nobel laureate – presented a possible mathematical relationship between cellular automata and quantum-field theories. This may provide a new way to interpret the origin of quantum mechanics, and hence a new approach to the gravitational force.


Gell-Mann himself ended the first day’s sessions with interesting personal recollections and reflections on “Some Lessons from 60 Years of Theorizing”. His main observations can be summarized as follows. First, every once in a while it is necessary to challenge some widely held idea, typically a prohibition against thinking in a particular way – a prohibition that turns out to have no real justification but holds up progress in understanding. It is important to identify such roadblocks and get round them. Second, it is sometimes necessary to distinguish ideas that are relevant for today’s problems from ones that pertain to deeper problems of the future. Trying to bring the latter into today’s work can cause difficulties. Finally, doubts, hesitation and messiness seem to be inevitable in the course of theoretical work (and experiments too, sometimes). Perhaps it is best to embrace this tendency rather than trying to organize it away – for example, by publishing alternative, contradictory ideas together with their consequences and leaving the choice between them until a later time.

The following day and a half covered a variety of topics. Rabindra Mohapatra of the University of Maryland discussed neutrino masses and the grand unification of flavour. Further talks focused on the origins of neutrino mixing and oscillations, as well as on what the LHC might reveal about the origin of neutrino mass.

John Schwarz of Caltech gave an interesting review of the recent progress in the correspondence between anti-de Sitter space and conformal field theory, which is one of the most active areas of modern research in string theory. He focused mainly on the testing and understanding of the duality and the construction and exploration of the string theory duals of QCD. Other talks reported on string phenomenology and string corrections in QCD at the LHC. Itzhak Bars of the University of Southern California described a gauge symmetry in phase space and the consequences for physics and space–time.

The sessions on quantum cosmology covered topics on black holes, dark matter, dark energy and the cosmological constant. These included a talk by Georgi Dvali of New York University, who discussed the physics of micro black holes.

The main sessions of the conference ended with a talk by Nobel laureate Kenneth Wilson of Ohio State University, a former student of Gell-Mann. He touched on a fundamental problem: could the testing of physics ever be complete? According to Wilson, in the real world no law about continuum quantities such as time, distance and energy can be established to be exact through experimental tests. Such tests cannot be carried out today, and cannot be done in the foreseeable future – although estimates of uncertainties can be improved in future. Wilson also took part in a discussion session with school teachers and students in a Physics Education Meeting held in conjunction with the conference.


The parallel sessions on particle physics, cosmology and general relativity attracted presentations by more than 30 speakers, many of whom were young physicists from Asia (China, China (Taiwan), India, Indonesia, Iran, Japan, Malaysia and Singapore). There was also a special session on quantum mechanics and complexity featuring invited speaker Kerson Huang of MIT who gave a talk on stages of protein folding and universal exponents.

• To mark the occasion of Gell-Mann’s 80th birthday, the publication of Murray Gell-Mann: Selected Papers, edited by Harald Fritzsch (World Scientific 2010), was launched during the conference.

Reviews of Accelerator Science and Technology Volumes 1 and 2

By Alexander W Chao and Weiren Chou (eds), World Scientific. Volume 1 Hardback ISBN 9789812835208, £55 ($99). E-book ISBN 9789812835215, $129. Volume 2 Hardback ISBN 9789814299343, £81 ($108).

The development of accelerators represents one of the great scientific achievements of the past century. The objective of this new journal – Reviews of Accelerator Science and Technology – is to give readers a comprehensive review of this dynamic and interesting field and of its various applications. The journal documents the tremendous progress made in the field of accelerator science and technology and describes its applications to other domains. It also assesses the prospects for the future development and use of accelerators.

The history and function of accelerators are told from the beginnings of the field through to future projects in an extremely competent and complete manner, for the authors have themselves contributed in many ways to the success of the fields presented. The journal shows clearly how progress in science is strongly coupled to advances in the associated instruments, allowing us to see beyond the macroscopic world – into the finer structure of matter – and to apply these instruments to fields such as elementary particle physics, medicine and industry. From the structure of cells, genes and molecules to the Standard Model of elementary particles, the scientific developments are traced back to the early evolution of these versatile instruments.


Volume 1 presents the history of accelerators, from the first table-top machines to the colliders of today and those being planned for the future. It is written in a fashion that serves as a historical account while also providing the scientific and technical basis for a deeper understanding. The volume transmits the spirit of this truly multidisciplinary and international field. With an excellent bibliography for each chapter, together with the historical development of the science of accelerators and the contributions by key figures in the field, it succinctly describes the overall history and future prospects of accelerators.

The articles in this volume include a review of the milestones in the evolution of accelerators, a description of the various types of accelerators (such as electron linear accelerators, high-power hadron accelerators, cyclotrons, colliders and synchrotron-light sources) as well as accelerators for medical and industrial applications. In addition, various advanced accelerator topics are discussed – including superconducting magnets, superconducting RF systems and beam cooling. There is also a historical account of the Superconducting Super Collider, and an article on the evolution, growth and future of accelerators and of the accelerator community.


Volume 2 focuses on the first of many specific subfields, its theme being medical applications of accelerators. Out of about 15,000 accelerators of all energies in existence today, more than 5000 are routinely used in hospitals for nuclear medicine and medical therapy. The articles in this volume feature overviews of the medical requirements written by physicians; a review of the status of radiation therapy, radioisotopes in nuclear medicine and hospital-based facilities; a detailed description of various types of accelerators used in medicine; and a discussion on future medical accelerators. In addition, one article is dedicated to a prominent figure of the accelerator community – Robert Wilson – in recognition of his seminal paper of 1946, “Radiological Use of Fast Protons”.

These first two volumes of Reviews of Accelerator Science and Technology are timely, instructive and comprehensive. The journal is well laid out and, thanks to the many informative photos and diagrams, it is also easy to read. It is written in an impartial and balanced way and covers the achievements made at several laboratories around the world. To ensure the highest quality, the articles are written by invitation only and the submitted papers have all been peer-reviewed. An editorial board consisting of distinguished scientists has also been formed to advise the editors.

The journal represents an excellent balance between a historical account of the developments in the field and the technical challenges and scientific progress made with such machines. Volume 2 in particular comes at an auspicious moment because the synergies between the science behind accelerators and the related spin-offs, such as the applications of accelerators to fight disease, are of great importance to human health – with a profound impact on our society.

In conclusion, the journal is a tribute to accelerators and the people who developed them. It appeals to the expert as well as to all scientists working and applying the use of accelerators. Active scientists and historians of science will appreciate this chronicle of the development of accelerators and their key role in the progress of various domains during the past century. It should be on the shelf of every scientist working with accelerators and of those with an interest in the history and future directions of accelerators and their applications. I hope that it also inspires students to look deeper into accelerator science and technology and to choose this field as a career.

CERN – the knowledge hub


If you ask 10 people working at CERN how they would describe what CERN is in a single sentence, the chances are that you will get 10 different answers.

Most people think of CERN, first and foremost, as an accelerator “factory” and a provider of facilities for the experiments. Some would state that it is a high-profile research organization, as well as a formidable training centre. Others will emphasize that it is an attractive and responsible employer. Finally, some may point out that CERN is, among other things, a strong, internationally recognized “brand”.

They are all correct in some way because CERN is a complex system with manifold activities and worldwide impact, to an extent that is sometimes hard to appreciate from an in-house perspective. Personally, I like to think of CERN as a “knowledge hub”. In fact, despite people’s different views on what CERN is, they are all part of its knowledge-exchange network.

Knowledge from universities, research institutes and companies flows into CERN through the people who come to participate in its activities. New knowledge is generated at CERN and knowledge then flows out, for example through R&D partnerships and technology transfer and through those who leave.

CERN is actually more than a hub because it plays the role of an active “catalyser” in the exchange of knowledge. As a concrete example, in February 2010 the “Physics for Health in Europe” workshop took place at CERN. It brought together more than 400 participants – both medical doctors and technology experts from the physics community. Medical experts attending expressed their appreciation that CERN had organized the workshop, acknowledging the need for such cross-cultural and interdisciplinary events, which cannot easily be organized at a national level. The value of CERN both as a provider of technologies and as a catalyst for the community was widely recognized. There are, of course, many other activities where CERN makes similar contributions towards global endeavours, for example, the Open Access initiative and the deployment of a computing Grid infrastructure in Europe.

Some of the knowledge exchanges taking place across CERN’s network are structured, explicit and therefore easy to track. This is the case, for example, with technology-transfer activities, which are typically formalized through contracts that give third parties access to CERN’s intellectual property portfolio. Other knowledge-exchange processes are tacit or informal. For example, knowledge transfer through people’s mobility from CERN towards European companies is hard to track in a systematic way.

The CERN Global Network aims to facilitate knowledge exchange across the various groups described above and to improve the visibility of partnership opportunities related to CERN’s activities. It will also enable CERN to gather data on knowledge transfer through mobility.

This Global Network will welcome former and current members of the CERN personnel (including users), companies from CERN’s member states, universities and research institutes. It will deliver a database of members and a dedicated website, providing information about partnership and knowledge-sharing opportunities (training, new R&D projects, transferable technologies, jobs etc) across the community. It will also foster the creation of special interest groups and organize events at CERN.

The scope of the Global Network is broader than a typical “alumni” association because it aims to build and reinforce links between all of the key players in the knowledge-exchange process – be they individuals or institutions. Interactions between individuals will generate a CERN-specific social and professional network, while interactions between individuals and institutions will create value in areas such as recruitment by linking job seekers with potential employers. Finally, interactions between institutions will enable the exchange of best practice in specific thematic areas.

As a last point, I would like to stress that the importance of knowledge transfer through day-to-day exchanges with the general public cannot be overemphasized. No doubt most readers of this article are routinely asked by ordinary citizens to explain what CERN is. In these circumstances we are all acting as ambassadors for CERN, endowed with the responsibility to remove misconceptions about our field and to explain the role of fundamental research as a driver for innovation.

Contributing to communication with the general public is everyone’s responsibility – the CERN Global Network will provide its members with information about the CERN-related projects that make an impact on society and that can be used to illustrate how CERN concretely delivers value to the community, in addition to its contribution to the advancement of basic science.

Facilitating and catalysing knowledge exchanges are among the most valuable benefits that we at CERN can deliver to society. A few words from George Bernard Shaw suffice to illustrate why: “If you have an apple and I have an apple, and we exchange these apples, then you and I will still each have one apple. But if you have an idea and I have an idea and we exchange these ideas, then each of us will have two ideas.”

• For more about the CERN Global Network, see http://globalnetwork.cern.ch.

Claudio Parrinello, head of knowledge and technology transfer, CERN.

A Zeptospace Odyssey: A Journey into the Physics of the LHC

by Gian Francesco Giudice, Oxford University Press. Hardback ISBN 9780199581917, £25 ($45).


If you are of the opinion that working physicists do not care about the history of their discipline or that theorists, like Gian Giudice, have no interest in the details of the experimental machines and detectors, this book will come as a surprise. The same is true if you share the view that it is not possible to describe the frontiers of modern physics – including the most speculative ones – to non-experts in a way that is both faithful and comprehensible. This book does all of that and is enjoyable reading, with the important information that it carries mixed in with many fun facts and anecdotes of all sorts. Not to mention the spot-on explanatory metaphors that are distributed profusely throughout almost every chapter.

One quality of this book is its comprehensive character, with its contents in three approximately equal parts. The first gives a brief but inspired history of particle physics, from J J Thomson’s discovery of the electron up to the setting of the Standard Model, without neglecting James Clerk Maxwell, quite appropriately, or even Galileo Galilei and Isaac Newton. In the author’s words, the expected “results for the LHC” – surely the main inspiration of the book – “cannot be appreciated without some notion of what the particle world looks like”. The central section “describes what the LHC is and how it operates” – no more or less than that – in a successful effort to make clear the astonishing technological innovations involved in the LHC enterprise. This is useful reading for everybody, including politicians.

Last but not least, the third section “culminates with an outline of the scientific aims and expectations of the LHC”, addressing the central open issues in particle physics and beyond. Here Giudice is also not afraid to venture into the description of interesting theoretical speculations, while always keeping a sober view of the overall subject. “We do not know what lies in zeptospace and the LHC has just started its adventure” is the very last sentence of the book, which I fully support. By the way, “a zeptometre is a billionth of a billionth of a millimetre”, not quite but almost the distance that will be explored for the first time by the LHC: hence “zeptospace”.

The coming of the LHC is certainly the main inspiration of the book. The awe and excitement brought on by the start of LHC operation exudes from all its pages. But I think there is more to it than that. There is a view of what I like to call “synthetic physics”, that is, the physics that aims to describe nature, or at least some part of it, in terms of a few principles and a few equations. In many respects the book pays tribute to “synthetic physics”. This is what determines the unity of its style and of its arguments. To whom do I recommend its reading? To everybody, experts or non-experts. I would in particular encourage young people, starting from those who are nearing the end of their high-school studies. I am sure that their efforts will be highly rewarded, not to mention the pleasure they will find. I believe, and I certainly wish, that this book will become required reading for anyone interested in scientific human endeavour, in the reality of our world.

Gli anelli del sapere. The Rings of Knowledge

by Federico Brunetti (ed.), Editrice Abitare Segesta. Hardback ISBN 9788886116930, €50.


With 350 photographs in about 150 pages, The Rings of Knowledge is a beautiful photographic collection interspersed with some text, whose role in conveying the message is almost peripheral. The book is bilingual, English and Italian, and so is aimed at an international audience.

The authors and editor have succeeded in illustrating the Italian contribution to CERN and the LHC. The book particularly emphasizes the role of the Italian National Institute for Nuclear Research (INFN) and its involvement in leading worldwide scientific projects, of which the LHC is the flagship. The pride in contributing to the “LHC era” – as defined by the president of INFN, Roberto Petronzio, in the foreword – sometimes causes the authors to fall into the trap of excessive self-celebration. Statements such as “The LHC could not have been realized without Italy’s collaboration” apply equally to many other member states of CERN and could be poorly received by an international readership.

The most distinctive feature is that Federico Brunetti, the editor, is an architect and photographer from the Industrial Design, Arts and Communication Department of the Politecnico di Milano. The chapters "The LHC between science and architecture" and "Physics as design" show his astonishment at the "enormous machines", the "enormous dimensions" and the "never-before-seen extremes of the place". However, they also show that communication is an issue for any specialized discipline, including architecture.

The wording of these chapters is complex and the concepts are described with a sort of jargon that makes reading difficult. In particular, the concept of “beauty” in design and in physics is mentioned several times and in different places but is never really presented in a clear way. This is a pity because it would have been an interesting point to develop in a comprehensible way.

Back to the main point of the book: I found the photographs truly stunning. The square layout is based on the Fibonacci series and reflects the link between physics and design. Unfortunately, even this fascinating point is not clearly explained in the text. For example, a caption on page 25 helps the reader's intuition, but simpler phrasing would significantly increase the overall enjoyment of the book.

AMS begins journey to Florida


The Alpha Magnetic Spectrometer (AMS) left CERN on 12 February on the first leg of its journey to the International Space Station (ISS). The special convoy carrying the experiment arrived four days later at the European Space Agency's research and technology centre (ESTEC) in Noordwijk in the Netherlands, after a journey of 600 km. AMS will later fly to the Kennedy Space Center in Florida before lifting off aboard the space shuttle, probably in July.

At 8.5 tonnes, and filled with superfluid helium, this was no ordinary shipment. The AMS detector was inserted into a support structure and surrounded by protective plastic foil before being placed in a box and loaded onto the special vehicle, which also carried a diesel generator powering a pump to keep the helium at 2 K. Some 20 members of the AMS collaboration followed the detector on its journey.

The detector components of AMS were constructed by an international team with significant contributions from France, Germany, Italy, Portugal, Spain and Switzerland, as well as from China, China (Taipei) and the US. Assembly then took place at CERN with help from the laboratory’s engineering services.

From 4 to 9 February, the detector was put through its paces using a test beam from the Super Proton Synchrotron (SPS). The AMS team used protons from the SPS to check the detector's momentum resolution. It also tested the detector's ability to distinguish electrons from protons. This is important for the measurement of cosmic rays, 90% of which are protons and which constitute a natural background for other signals. The AMS collaboration will be looking for an abundance of positrons and electrons from space, one of the possible markers for dark matter.

The next step in testing has now moved to ESTEC, where ESA's thermal-vacuum chamber simulates the vacuum of space. Here the team will test the detector's capacity to exchange heat and so maintain its thermal balance. This is essential to the functioning of the detector's electronics and superconducting magnet, the first of its kind to be launched into space. If all goes well, towards the end of May the detector will embark on a journey to NASA's Kennedy Space Center aboard a US Air Force C-5 aircraft. There, it will be loaded aboard the space shuttle Endeavour for mission STS-134, one of the final flights of the shuttle programme. Lift-off is scheduled for July.

Once docked to the ISS, AMS will examine fundamental issues about matter and the origin and structure of the universe directly from space. Its central aim is to search for dark matter and antimatter. Its data will be transmitted from the ISS to Houston and on to the detector control centre at CERN, as well as to a number of regional physics-analysis centres that have been set up by the collaborating institutes.

Beam returns for the start of the longest run


In the early hours of 28 February, beam was circulating again in the LHC at the start of operations that are scheduled to continue for the next 18 to 24 months. The objective is to deliver 1 fb⁻¹ of data to the experiments at 7 TeV in the centre of mass, so providing enough data to make significant advances across a range of physics channels.

This restart followed a break of several weeks during which LHC teams carried out essential work to ensure the correct functioning of the magnets at high current. They verified several thousand channels of the new quench-protection system (nQPS) and measured precisely the resistance of the 10,000 splices connecting the magnets, finding no values outside acceptable limits.

Once work on the nQPS had been completed, it was the turn of the hardware-commissioning team to test the main dipoles and quadrupoles of the LHC up to a current of 6 kA. This will allow proton collisions at 7 TeV in the centre of mass during the coming run. After completing these tests the hardware-commissioning team handed over to the operations team.

Their initial operations centred on tests without beam to verify the correct functioning of all of the machine systems in unison. Late on 27 February the LHC was ready to receive beam again, and by just after 4.00 a.m. on 28 February protons had circulated in each direction round the machine. For the operations team this was the beginning of a period of optimization at the injection energy of 450 GeV, investigating parameters associated with beam injection, collimation and the beam-abort systems, as well as studies to improve the beam lifetime. The first ramps – without beam – were made on the evening of 9 March, to an energy of 1.18 TeV, the highest level achieved in 2009. Ramps to 3.5 TeV per beam are scheduled for later in March, with collisions planned for the end of the month.

• For the latest LHC news, see www.cern.ch/bulletin, as well as www.cern.ch and www.twitter.com/cern.

ALBA’s booster accelerator in operational test


The first operational tests of the booster accelerator for the ALBA synchrotron light source in Barcelona took place in January. The results show that all of the components, subsystems and equipment perform according to specification. This was the main objective of the tests, which were performed over a short period so as not to interfere excessively with the installation of the storage ring and the beamlines.

ALBA is a third-generation synchrotron light-source facility co-financed by the Catalan and Spanish governments, which is now in its last phase of construction at Cerdanyola del Vallès, Barcelona. The facility, which is being constructed and operated by the CELLS consortium, will provide synchrotron light of world-class quality (brilliance) for research in a range of scientific disciplines.

The facility consists of three accelerators – the linac, booster and storage ring – and seven beamlines (in the initial phase). The linac creates the electron beam and accelerates it up to 100 MeV. The beam is then injected into the second accelerator, the booster, where the energy increases to 3 GeV. This is the critical part of the accelerator chain. Ultimately, the beam will be injected into the storage ring and stored to produce synchrotron light.

The operational test of the booster began on 21 December 2009, when beam was transported from the linac to the booster for the first time. After a Christmas shut-down, tests recommenced on 11 January and on the following day, beam made the first turns round the machine – and produced the first synchrotron light seen in Spain. On 19 January the ALBA team was able to accelerate the beam to 600 MeV and two days later they achieved 2.7 GeV with a circulating beam of 0.7 mA. The two-week test finished on 24 January to allow for further installation work.

ALBA's booster was completely designed, assembled and tested by the ALBA team, making it the first high-energy accelerator built in Spain. Its design also gives it the smallest emittance (a measure of beam size and divergence) in the world for an accelerator of its kind. The next milestones will be the operation of the storage ring, in the autumn, followed by the operation of the complete facility, expected for the beginning of 2011.

GSI team first to trap superheavy element


An international team of researchers at GSI Darmstadt has successfully contained atoms of nobelium (atomic number 102) in an ion trap. This is the first time that a superheavy element has been trapped. It allowed the team to measure the mass of three isotopes of the element with unprecedented accuracy.

The measurements took place in the SHIPTRAP facility at GSI, which combines an ion trap with the Separator for Heavy Ion reaction Products (SHIP) – a velocity filter that has already been used in the discovery of six superheavy elements at GSI. SHIPTRAP consists of a stopping cell, an RFQ buncher and a double Penning-trap system inside a 7 T superconducting magnet. The cell of high-purity helium stops and thermalizes radioactive nuclei, which SHIP delivers at energies of a few hundred keV/u. The stopped ions are extracted into the RFQ structure, where they are cooled, accumulated and bunched. The ions then enter the first Penning trap, where they are selected according to mass by a buffer-gas cooling technique with a resolving power of about 50,000. Finally, a purified sample of ions is injected into the second Penning trap, where their mass is determined precisely via their cyclotron frequency.
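The mass determination in the second Penning trap rests on the relation between an ion's cyclotron frequency and its mass. As a schematic sketch (the symbols here are generic, not taken from the SHIPTRAP paper):

```latex
% An ion of charge q and mass m in a magnetic field B circulates at the
% cyclotron frequency
\[
  \nu_c = \frac{qB}{2\pi m}
\]
% Measuring a reference ion of well-known mass m_ref in the same field B
% eliminates B, so the unknown mass follows from the frequency ratio alone:
\[
  m = \frac{q}{q_{\mathrm{ref}}}\,
      \frac{\nu_{c,\mathrm{ref}}}{\nu_c}\, m_{\mathrm{ref}}
\]
```

Because only a frequency ratio enters, the field strength itself need not be known to the same precision as the final mass value.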


The nobelium ions were produced in fusion reactions of a 48Ca beam with lead-foil targets (206–208Pb). They were then separated from the beam in the SHIP velocity filter, entering the stopping cell at a rate of less than one ion per second (in the case of 252No). The decelerated ions were extracted into the RFQ within a few milliseconds and then injected in pulses into SHIPTRAP's double Penning-trap system.

By directly comparing the cyclotron frequency of the nobelium ions in SHIPTRAP with the frequency of precisely known reference ions, the research team was able to determine the masses of the nobelium isotopes 252–254No to uncertainties of about 10 keV/c² – a relative precision of 0.05 ppm (Block et al. 2010). 254No is now the heaviest radionuclide to have its mass measured directly and 252No is the lowest-production-rate radionuclide whose mass has been measured with a Penning trap.
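As a quick back-of-envelope check (the numbers below are rough approximations, not taken from the paper), a 10 keV/c² uncertainty on a nucleus of roughly 254 atomic mass units does indeed correspond to a few parts in 10⁸:

```python
# Rough consistency check of the quoted relative precision.
# Assumption: 254No weighs approximately 254 atomic mass units.
U_KEV = 931494.10242           # 1 atomic mass unit in keV/c^2 (CODATA)

mass_kev = 254 * U_KEV         # approximate mass of 254No in keV/c^2
uncertainty_kev = 10.0         # quoted mass uncertainty in keV/c^2

relative_ppm = uncertainty_kev / mass_kev * 1e6
print(f"relative precision ~ {relative_ppm:.2f} ppm")
```

This lands at about 0.04 ppm, consistent with the article's rounded figure of 0.05 ppm.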

These mass values provide new, accurate reference points in the region of superheavy elements. The technique also holds promise for identifying elements on the way to the predicted “island of stability”. One of the next goals of SHIPTRAP is to extend these accurate mass measurements to the transactinide region, starting with long-lived rutherfordium isotopes that terminate decay chains originating from Z = 116.

• Element 112, first observed at GSI in 1996, now officially carries the name copernicium and the chemical symbol Cn, after approval by the International Union of Pure and Applied Chemistry (IUPAC). The name honours scientist and astronomer Nicolaus Copernicus. The discoverers had suggested Cp as the symbol, but as this abbreviation has other scientific meanings, they agreed with IUPAC on Cn. Copernicium is the heaviest element officially recognized by IUPAC.
