
The incurable attraction of physics

A noble gas, a missing scientist and an underground laboratory. It could be the starting point for a classic detective story. But a love story? It seems unlikely. However, add in a back-story set in Spain during General Franco’s rule, plus a “eureka” moment in California, and the ingredients are there for a real romance – all of it rooted firmly in physics.


When Spanish particle physicist Juan José Gómez Cadenas arrived at CERN as a summer student, the passion that he already had for physics turned into an infatuation. Thirty years later and back in his home country, Gómez Cadenas is pursuing one of nature’s most elusive particles, the neutrino, by looking where it is expected not to appear at all – in neutrinoless double-beta decay. Moreover, fiction has become entwined with fact, as he was recently invited to write a novel set at CERN. The result, Materia Extraña (Strange Matter), is a scientific thriller that has already been translated into Italian.

Critical point

“Particle physicists were a rare commodity in Spain when the country first joined CERN in 1961,” Cecilia Jarlskog noted 10 years ago after a visit to “a young and rapidly expanding community” of Spanish particle physicists. Indeed, the country left CERN in 1969, when Juan was only nine years old and Spain was still under the Franco regime. Young Juan – or “JJ” as he later became known – initially wanted to become a naval officer, like his father, but in 1975 he was introduced to the wonders of physics by his cousin, Bernardo Llanas, who had just completed his studies with the Junta de Energía Nuclear (the forerunner of CIEMAT, the Spanish research centre for energy, the environment and technology) at the same time as Juan Antonio Rubio, who was to do so much to re-establish particle physics in Spain. The young JJ set his sights on the subject – “Suddenly the world became magic,” he recalls, “I was lost to physics” – and so began the love affair that was to take him to CERN and, in a strange twist, to write his first novel.

The critical point came in 1983. JJ was one of the first Spanish students to gain a place in CERN’s summer student programme when his country rejoined the organization. It was an amazing time to be at the laboratory: the W and Z bosons had just been discovered and the place was buzzing. “I couldn’t believe this place, it was the beginning of an absolute infatuation,” he says. That summer he met two people who were to influence his career: “My supervisor, Peter Sonderegger, with whom I learnt the ropes as an experimental physicist, and Luis Álvarez-Gaume, a rising star who took pity on the poor, hungry fellow-Spaniard hanging around at night in the CERN canteen.” After graduating from Valencia University, JJ’s PhD studies took him to the DELPHI experiment at CERN’s Large Electron–Positron collider. With the aid of a Fulbright scholarship, he then set off for America to work on the Mark II experiment at SLAC. From there it was back to CERN and DELPHI again, but in 1994 he left once more for the US, this time following his wife, Pilar Hernandez, to Harvard. An accomplished particle-physics theorist, she converted her husband to her speciality, neutrino physics, thus setting him on the trail that would lead him through the NOMAD, HARP and K2K experiments to the challenge of neutrinoless double-beta decay.

The neutrinoless challenge

A professor of physics for the past 15 years at the Institute of Nuclear and Particle Physics (IFIC), a joint venture between the University of Valencia and the Spanish research council (CSIC), JJ is currently leading NEXT – the Neutrino Experiment with a Xenon TPC. The aim is to search for neutrinoless double-beta decay using a high-pressure xenon time-projection chamber (TPC) in the Canfranc Underground Laboratory in the Spanish Pyrenees. JJ believes that the experiment has several advantages in the hunt for this decay mode, which would demonstrate that the neutrino must be its own antiparticle, as first proposed by Ettore Majorana (whose own life ended shrouded in mystery). The experiment uses xenon, which is relatively cheap and also cheap to enrich because it is a noble gas. Moreover, NEXT uses gaseous xenon, which gives 10 times better energy resolution for the decay electrons than the liquid form. By using a TPC, it also provides a topological signature for the double-beta decay.
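To put the resolution claim in context, the intrinsic limit set by ionization statistics in the gas can be estimated from the Fano factor. A minimal sketch, using representative literature values for the Fano factor, W-value and 136Xe Q-value – illustrative assumptions, not NEXT’s official figures:

    import math

    # Intrinsic energy-resolution limit of a gaseous-xenon TPC:
    # FWHM/E = 2.355 * sqrt(F * W / E)
    F_GAS = 0.15        # Fano factor of gaseous xenon (assumed value)
    W_EV = 22.0         # mean energy per ionization electron, eV (assumed)
    Q_BB_EV = 2.458e6   # 136Xe double-beta Q-value, eV

    fwhm_fraction = 2.355 * math.sqrt(F_GAS * W_EV / Q_BB_EV)
    print(f"Intrinsic resolution ~ {100 * fwhm_fraction:.2f}% FWHM at the Q-value")

This comes out at roughly 0.3% FWHM; in liquid xenon, recombination fluctuations make the effective Fano factor far larger, which is what lies behind the factor-of-10 difference mentioned above.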

The big challenge was to find a way to amplify the charge in the xenon gas without inducing sparks. The solution came when JJ talked to David Nygren, inventor of the TPC at Berkeley. “It was one of those eureka moments,” he recalls. “Nygren proposed using electroluminescence, where you detect light emitted by ionization in a strong field near the anode. You can get 1000 UV photons for each electron. He immediately realized that we could get the resolution that way.” JJ then came up with an innovative scheme to detect those electrons in the tracking plane using light-detecting pixels (the silicon photomultiplier) – and the idea for NEXT was born. “It is hard for me not to believe in the goddess of physics,” says JJ. “Every time that I need help, she sends me an angel. It was Abe Seiden in California, Gary Feldman in Boston, Luigi di Lella and Ormundur Runolfson at CERN, Juan Antonio Rubio in Spain … and then Dave. Without him, I doubt NEXT would have ever materialized.” The collaboration now involves not only Spain and the US but also Colombia, Portugal and Russia. The generous help of a special Spanish funding programme, called CONSOLIDER-INGENIO, provided the necessary funds to get it going. “More angels came to help here,” he explains, “all of them theorists: José Manuel Labastida, at the time at the ministry of science, Álvaro de Rújula, my close friend Concepción González-García … really, the goddess gave us a good hand there.”

Despite the financial problems in Spain, JJ says that “there is a lot of good will” in MINECO, the Ministry of Economy, which currently handles science in Spain. He points out that there has already been a big investment in the experiment and that there is full support from the Canfranc Laboratory. He is particularly grateful for the “huge support and experience” of Alessandro Bettini, the former director of the Gran Sasso National Laboratory in Italy, who is now in charge at Canfranc. JJ finds Bettini and Nygren – both in their mid-seventies – inspirational characters, calling them the “Bob Dylans” of particle physics. Indeed, he set up an interview with both of them for the online cultural magazine Jotdown, to which he regularly contributes a blog called “Faster than light”.

In many ways, JJ’s trajectory through particle physics is similar to that of any talented, energetic particle physicist pursuing his passion. So what about the novel? When did an interest in writing begin? JJ says that it goes back to when his family eventually settled in the town of Sagunto, near Valencia, when he was 15. He found this ancient city, where modern steel-making stands alongside Roman ruins, to be “a crucible of ideas”, where writers and artists mingled with the steel-workers, who wanted a more intellectual lifestyle for their children – especially after the return of democracy with the new constitution in 1978, following Franco’s death. JJ started writing poetry while studying physics in Sagunto, and when physics took him to SLAC in 1986, as a member of Stanford University, he was allowed to sit in on the creative-writing workshop. “I was not only the only non-native American but also the only physicist,” he recalls. “I’m not sure that they knew what to make of me.” Years later, he continued his formal education as a writer at the prestigious Escuela de Letras in Madrid.

A novel look at CERN

Around 2003, CERN was starting to become bigger news, with the construction of the LHC, experiments on antimatter and an appearance in Dan Brown’s mystery-thriller Angels & Demons. Having already written a book of short stories, La agonía de las libélulas (Agony of the Dragonflies), published in 2000, JJ was approached by the Spanish publisher Espasa to write a novel that would involve CERN. Of course, the story would require action but it would also be a personal story, imbued with JJ’s love for the place. Materia Extraña, published in 2008, “deals with how someone from outside tries to come to grips with CERN,” he explains, “and also with the way that you do science.” It gives little away to say that at one and the same time it is CERN – but not CERN. For example, the director-general is a woman, with an amalgam of the characteristics that he observes to be necessary for women to succeed in physics. “The novel was presented in Madrid by Rubio,” says JJ. “At the time, we couldn’t have guessed that he had so little time left.” (Rubio was to pass away in 2010.)

When asked by Espasa to write another book, JJ turned from fiction to fact and the issue of energy. Here he encountered “a kind of Taliban of environmentalism” and became determined to argue a more rational case. The result was El ecologista nuclear (The Nuclear Environmentalist, now published in English) in which he sets down the issues surrounding the various sources of energy. Comparing renewables, fossil fuels and nuclear power, he puts forward the case for an approach based on diversity and a mixture of sources. “The book created a lot of interest in intellectual circles in Spain,” he says. “For example, Carlos Martínez, who was president of CSIC and then secretary of state (second to the minister) liked it quite a bit. Cayetano López, now director of CIEMAT, and an authority in the field, was kind enough to present it in Madrid. It has made some impact in trying to put nuclear energy into perspective.”

So how does JJ manage to do all of this while also developing and promoting the NEXT experiment? “The trick is to find time,” he reveals. “We have no TV and I take no lunch, although I go for a swim.” He is also one of those lucky people who can manage with little sleep. “I write generally between 11 p.m. and 2 a.m.,” he explains, “but it is not like a mill. I’m very explosive and sometimes I go at it for 12 hours, non-stop.”

He is now considering writing about nuclear energy, along the lines of the widely acclaimed Sustainable Energy – without the hot air by Cambridge University physicist David MacKay, who is currently the chief scientific adviser at the UK’s Department of Energy and Climate Change. “The idea would be to give the facts without the polemic,” says JJ, “to really step back.” He has also been asked to write another novel, this time aimed at young adults, a group where publisher Espasa is finding new readers. While his son is only eight years old, his daughter is 12 and approaching this age group. This means that he is in touch with young-adult literature, although he finds that at present “there are too many vampires” and admits that he will be “trying to do better”. That he is a great admirer of the writing of Philip Pullman, the author of the bestselling trilogy for young people, His Dark Materials, can only bode well.

• For more about the NEXT experiment see the recent CERN Colloquium by JJ Gómez Cadenas at http://indico.cern.ch/conferenceDisplay.py?confId=225995. For a review of El ecologista nuclear see the Bookshelf section of this issue.

Work for the LHC’s first long shutdown gets under way

The LHC has been delivering data to the physics experiments since the first collisions in 2009. Now, with the first long shutdown, LS1, which started on 13 February, work begins to refurbish and consolidate aspects of the collider, together with the experiments and the other accelerators in the injector chain.

LS1 was triggered by the need to consolidate the magnet interconnections so as to allow the LHC to operate at the design centre-of-mass energy of 14 TeV for proton–proton collisions. It has now turned into a programme involving all of the groups that have equipment in the accelerator complex, the experiments and the infrastructure systems. LS1 will see a massive programme of maintenance for the LHC and its injectors in the wake of more than three years of operation without the long winter shutdowns that were the norm in the past.

The main driving effort will be the consolidation of the 10,170 high-current splices between the superconducting magnets. As many as 1000–1500 splices will need to be redone and more than 27,000 shunts added to overcome possible problems with poor contacts between the superconducting cable and the copper stabilizer that led to the breakdown in September 2008.

The teams will start by opening up the interconnections between each of the 1695 main magnet cryostats. They will repair and consolidate around 500 interconnections at a time, in work that will gradually cover the entire 27-km circumference of the LHC. The effort on the LHC ring will also involve the exchange of 19 magnets, consolidation of the cryogenic feed boxes and installation of pressure-relief valves on the sectors that have not yet been equipped with them.

The Radiation to Electronics project (R2E) will see the protection of sensitive electronic equipment optimized by relocating the equipment or by adding shielding. Nor will work during LS1 be confined to the LHC. Major renovation work is scheduled, for example, for the Proton Synchrotron, the Super Proton Synchrotron and the LHC experiments.

Preparations for LS1 started more than three years ago, with the detailed planning of manpower and other resources. For example, Building 180 on the Meyrin site at CERN recently became a hive of activity as a training centre for the technicians who are implementing the various repairs and modifications. The pictures shown here give the flavour of this activity.

(Photos: a view of the Large Magnet Facility; plug-in modules; welding; workshops; the cutting tool.)

• More detailed articles on the work being done during LS1 will appear in the coming months. For news of the activities, watch out for articles in CERN Bulletin at http://cds.cern.ch/journal/CERNBulletin/2013/06/News%20Articles/?ln=en.

Spin physics in Dubna

SPIN 2012, the 20th International Symposium on Spin Physics, took place at the Joint Institute for Nuclear Research (JINR) in Dubna on 17–22 September. Around 300 participants attended from JINR and institutes in 22 countries (mainly Germany, Italy, Japan, Russia and the US). It consisted of a traditional mix of plenary and parallel sessions. Presentations covered the spin structure of hadrons, spin effects in reactions with lepton and hadron beams, spin physics beyond the Standard Model and future experiments, as well as the techniques of polarized beams and targets, and the application of spin phenomena in medicine and technology.


The symposium began with a focus on work at Dubna, starting with the unveiling of a monument to Vladimir Veksler, who invented the principle of phase stability (independently from Edwin McMillan in the US) and founded the 10 GeV Synchro-phasotron in Dubna in 1955. Talks followed about the future projects to be carried out at JINR’s newest facility, the Nuclotron-based Ion Collider fAcility (NICA). The complex will include an upgraded superconducting synchrotron, Nuclotron-M, with an area for fixed-target experiments, as well as a collider with two intersections for polarized protons (at 12 GeV per beam) or deuterons and nuclei (5 A GeV per beam). It will provide opportunities for a range of polarization studies to complement global data and will particularly help to solve the puzzles of spin effects that have been awaiting solutions since the 1970s. The spin community at the symposium supported the plans for these unique capabilities, and JINR’s director, Victor Matveev, announced that the project is ready to invite international nominations for leading positions in the spin programme at NICA.

The experimental landscape

In the US, Jefferson Lab’s programme of experiments on generalized parton distributions (GPDs) will be implemented with upgraded detectors and an increase in the energy of the Continuous Electron Beam Accelerator Facility from 6 GeV up to 12 GeV. The laboratory is also considering the construction of a new synchrotron to accelerate protons and nuclei up to 250 GeV before collision with 12 GeV electrons. In a similar way, a new 10–30 GeV electron accelerator is being proposed at Brookhaven National Laboratory to provide collisions between electrons and polarized protons and ions, including polarized 3He nuclei, at the Relativistic Heavy-Ion Collider (RHIC). The aim will be to investigate the spin structure of the proton and the neutron.


At CERN, the COMPASS-II project has been approved, firstly to study Drell-Yan muon-pair production in collisions of pions with polarized nucleons, to investigate the nucleon’s parton distribution functions (PDFs). A second aim is to study GPDs via the deeply virtual Compton-scattering processes of exclusive photon and meson production. The latter processes will provide the possibility for measuring the contribution of the orbital angular momenta of quarks and gluons to the nucleon spin. The Institute of High Energy Physics (IHEP), Protvino, has a programme at the U-70 accelerator for obtaining polarized proton and antiproton beams from Λ decay for spin studies at the SPASCHARM facility, which is currently under construction.

The participants heard with interest the plans to construct dedicated facilities for determining the electric dipole moment (EDM) of the proton and nuclei, with proposals by the Storage Ring EDM collaboration at Brookhaven and the JEDI collaboration at Jülich. A permanent electric dipole moment of a fundamental particle violates both parity and time-reversal invariance. Its detection would indicate physics beyond the Standard Model and would, in particular, make it possible to approach the problem of understanding the baryon asymmetry of the universe. The proposed experiments would reduce the measurement limit on the deuteron EDM down to 10⁻²⁹ e cm.
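The symmetry argument can be made explicit (standard textbook reasoning, summarized here for convenience). A particle’s EDM d must point along its spin, giving an interaction

    H_{\mathrm{EDM}} = -d \, (\vec{S}/S) \cdot \vec{E}

Under parity the electric field \vec{E} changes sign while the spin \vec{S} does not; under time reversal the spin changes sign while \vec{E} does not. A non-zero d therefore violates both P and T – and hence CP, if the CPT theorem holds.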

Classical experiments studying the nucleon spin structure at high energies use both lepton scattering on polarized nucleons (e.g. in HERMES at DESY, COMPASS and at Jefferson Lab) and collisions of polarized hadrons (at RHIC, IHEP and JINR). A unified description of these different high-energy processes is becoming possible within the context of QCD, the theory of strong interactions. Properties such as factorization, local quark–hadron duality and asymptotic freedom allow the characteristics of a process to be calculated within the framework of perturbation theory. At the same time, PDFs, correlation functions and fragmentation functions are not calculable in perturbative QCD but, being universal, they can either be parameterized and determined from various processes or calculated within model approaches. A number of talks at the symposium were devoted to the development and application of such models.

Theory confronts experiment

Experiments involving spin have brought about the demise of more theories than any other single physical parameter. Modern theoretical descriptions of spin-dependent PDFs, especially those including the internal transverse-parton motion, were discussed at the symposium. In this case, the number of PDFs increases and the picture that is related to them loses – to a considerable degree – the simplicity of a parton model with its probabilistic interpretation. One of the difficulties here concerns how the PDFs evolve with a change in the wavelength of the probe particle. A new approach to solving this problem was outlined and demonstrated for the so-called Sivers asymmetry measured in data from the HERMES and COMPASS experiments (figure 1).
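For orientation, a textbook point of reference (not the new approach itself): for ordinary collinear PDFs, the dependence on the resolution scale \mu of the probe is governed by the DGLAP evolution equations,

    \frac{\partial f_i(x, \mu^2)}{\partial \ln \mu^2} = \frac{\alpha_s(\mu^2)}{2\pi} \sum_j \int_x^1 \frac{dz}{z} \, P_{ij}(z) \, f_j(x/z, \mu^2),

where the splitting functions P_{ij}(z) are calculable in perturbation theory. The transverse-momentum-dependent case is considerably more involved, which is part of the difficulty alluded to above.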


The helicity distributions of the quarks in a nucleon are the most thoroughly studied so far. The results of the most accurate measurements by COMPASS, HERMES and the CLAS experiment at Jefferson Lab were presented by the collaborations. The present-day experimental data are sufficiently precise to include them in QCD analysis. Two new alternative methods for the QCD analysis of deep-inelastic scattering (DIS) and semi-inclusive DIS (SIDIS) data allow a positive polarization of strange quarks to be excluded with a high probability. As for the gluon polarization, the results of its direct measurement by the COMPASS experiment, which are confirmed by the PHENIX and STAR experiments at RHIC, also agree with QCD analysis. The low value of gluon polarization indicates that its contribution to nucleon spin is not enough to resolve the so-called nucleon-spin crisis. Hopes to overcome this crisis are now connected to the possible contributions of the orbital angular momenta of quarks and gluons, to be measured from GPDs. There were talks on different theoretical aspects of GPDs, as well as experimental aspects of their measurement, in the context of the HERMES, CLAS and COMPASS experiments.

Other important spin distribution functions manifest themselves in the lepton DIS off transversely polarized nucleons. The processes in which the polarization of only one particle (initial or final) is known are especially interesting. However, although relatively simple from the point of view of the experiment, they are complicated from the theoretical point of view (such complementarities frequently occur). These single-spin asymmetries are related to T-odd effects, i.e. they seemingly break invariance with respect to time reversal. However, it is a case of “effective breaking” – that is, it is not related to a true non-invariance of a fundamental interaction (here, the strong interaction, described by QCD) with respect to time reversal but to its simulation by the effects of re-scattering in the final or initial states. The single asymmetries have been studied by theorists for more than 20 years. These studies have received a fresh impetus in recent years in connection with new experimental data on single-spin asymmetries in the semi-inclusive electroproduction of hadrons off longitudinally and transversely polarized and unpolarized nucleons.

Reports from the COMPASS collaboration on transverse-momentum-dependent (TMD) asymmetries were among the highlights of the symposium. The experiment is studying as many as 14 different TMD asymmetries. Two of them, the Collins and Sivers asymmetries (figure 2) – which are responsible for the left–right asymmetries of hadrons in the fragmentation of transversely polarized quarks and for quark distributions in transversely polarized nucleons, respectively – are now definitely established in the global analysis of all of the available data, although other TMD effects require further study. The results of studies of the transverse structure of the proton at Jefferson Lab were also presented at the symposium.


The PHENIX and STAR collaborations have new data on the single-spin asymmetries of pions and η-mesons produced in proton–proton collisions at 200 GeV per beam at RHIC, with one of the beams polarized and the other unpolarized. They observe surprisingly large asymmetries in the forward rapidity region of the fragmenting polarized or unpolarized protons, falling to zero in the central rapidity region. A similar effect was observed earlier at Protvino and at Fermilab, but at lower energies, thus confirming the energy independence of the effect (figure 3). In addition, the asymmetries measured at RHIC show no fall-off with rising transverse momentum. The particular mechanism behind these asymmetries remains a puzzle.

So although single-spin asymmetries on the whole are described by existing theory, developments continue. The T-odd distribution functions involved lose the key property of universality and become “effective”, that is, dependent on the process in which they are observed. In particular, the most fundamental QCD prediction is the change of sign of the Sivers PDF determined from SIDIS processes and from Drell-Yan pair-production on a transversely polarized target. This prediction is to be checked by the COMPASS-II experiment as well as at RHIC, NICA and in the PANDA and PAX experiments at the Facility for Antiproton and Ion Research.

New data from Jefferson Lab on measurements of the ratio of the proton’s electric and magnetic form factors performed by the technique of recoil polarization gave rise to significant interest and discussions at the symposium. The previous measurements from Jefferson Lab showed that this ratio is not constant, as had long been supposed, but decreases linearly with increasing momentum transfer, Q² – the so-called “form-factor crisis”. New data from the GEp(III) experiment indicate a flattening of this ratio in the region of Q² = 6–8 GeV². The question of whether this behaviour is a result of an incomplete calculation of radiative corrections – in particular, two-photon exchange – remains open.

The symposium enjoyed hearing the first results related to spin physics from experiments at CERN’s LHC. In particular, many discussions focused on the role of spin in investigating the recently discovered particle with a mass of 125 GeV, which could be the Higgs boson, as well as in studies of the polarization of W and Z bosons, and in heavy-quark physics. A number of talks were dedicated to the opportunities for theory related to searches for the Z′ and other exotics at the LHC and the future electron–positron International Linear Collider.

On the technical side there was confirmation of the method of obtaining the proton-beam polarization at the COSY facility in Jülich by spin filtration in the polarized gas target. This method can also be used for polarization of an antiproton beam, which will be important for measurements of different spin distributions in the nucleon via Drell-Yan muon-pair production in polarized proton–antiproton collisions in the PANDA and PAX experiments. There were also discussions on sources of polarized particles, the physics of polarized-beam acceleration, polarimeters and polarized-target techniques. In addition, there were reports on applications of hyperpolarized 3He and 19F in different fields of physics, applied science and medicine.

The main results of the symposium were summarized in an excellent concluding talk by Franco Bradamante from Trieste. The proceedings will be published in special volumes of Physics of Elementary Particles and Atomic Nuclei. The International Committee on Spin Physics, which met during the symposium, emphasized the excellent organization and success of the meeting in Dubna and decided that the 21st International Symposium on Spin Physics will take place in Beijing in September 2014.

The incomprehensibility principle

Educators and psychologists invented the term “attention span” to describe the length of time anyone can concentrate on a particular task before becoming distracted. It is a useful term but span, or duration, is only one aspect of attention. Attention must also have an intensity – and the two variables are independent of each other. Perhaps one can postulate an analogue of the Heisenberg uncertainty principle, in which the intensity of attention multiplied by its span cannot exceed some fixed value. I call this the “incomprehensibility principle” and I have had plenty of opportunities to observe its consequences.
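In symbols (the notation is mine; the article defines only the words): writing I for the intensity of attention, T for its span and C for the fixed bound, the postulate reads

    I \, T \le C,

in deliberate contrast with Heisenberg’s \Delta E \, \Delta t \ge \hbar/2 – here the product is bounded from above, not below.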


In the hands of skilled presenters, information can be carefully packaged as entertainment so that the attention needed to digest it is minimal. The trick is to mask the effort with compelling emotional appeal and a floppy boy-band haircut. However, the need to pay attention is still there; in fact, absorbing even the most trivial information demands a modicum of attention. How many of us, when leaving a cinema, have had the nagging feeling that although the film made great entertainment some details of the plot remained less than crystal clear?

The existence of a minimum level of attention suggests that it is, in some sense, a quantum substance. This means that under close examination, any apparently continuous or sustained effort at paying attention will be revealed as a series of discrete micro-efforts. However, while attention can be chopped up and interleaved with other activities, even tiny pulses of attention demand full concentration, to the exclusion of all other voluntary activities. Any attempt at multitasking, such as using a mobile phone while driving a car, is counterproductive.

The incomprehensibility principle plays a major role in education, where it is closely linked to the learning process. Because of the subject matter and/or the teacher, some school lessons require more time to assimilate than others. This trend accelerates in higher education. In my case, a hint of what was to come appeared during my third year of undergraduate physics, when I attended additional lectures on quantum mechanics in the mathematics department at Imperial College London.

My teacher was Abdus Salam, who went on to share the Nobel Prize for Physics in 1979. Salam’s lectures were exquisitely incomprehensible; as I look back, I realize he was probably echoing his own experiences at Cambridge some 15 years earlier at the hands of Paul Dirac. But he quickly referred us to Dirac’s book, The Principles of Quantum Mechanics. At a first and even a second glance, this book shone no light at all but after intense study, a rewarding glimmer of illumination appeared out of the darkness.

Motivated by Salam’s unintelligibility, I began postgraduate studies in physics only to find that my previous exposure to incomprehensibility had been merely an introduction. By then, there were no longer any textbooks to fall back on and journal papers were impressively baffling. With time, though, I realized that – like Dirac’s book – they could be painfully decrypted at “leisure”, line by line, with help from enlightened colleagues.

The real problem with the incomprehensibility principle came when I had to absorb information in real time, during seminars and talks. The most impenetrable of these talks always came from American speakers because they were, at the time, wielding the heavy cutting tools at the face of physics research. Consequently, I developed an association between incomprehensibility and accent. This reached a climax when I visited the US, where I always had the feeling that dubious characters hanging out at bus stations and rest stops must somehow be experts in S-matrix theory and the like, travelling from one seminar to the next. Several years later, when I was at CERN, seminars were instead delivered in thick European accents and concepts such as “muon punch-through” became more of an obstacle when pointed out in a heavy German accent.

Nevertheless, I persevered and slowly developed new skills. The incomprehensibility principle cannot be bypassed but even taking into account added difficulties such as the speaker’s accent or speed of delivery – not to mention bad acoustics or poor visual “aids” – it is still possible to optimize one’s absorption of information.

One way of doing this is to monitor difficult presentations in “background mode”, paying just enough attention to follow the gist of the argument until a key point is about to be reached. At that moment, a concerted effort can be made to grab a vital piece of information as it whistles past, before it disappears into the obscurity of the intellectual stratosphere. The trick is to do this at just the right time, so that each concentrated effort is not fruitless. “Only cross your bridges when you come to them”, as the old adage goes.

By adopting this technique, I was able to cover frontier meetings on subjects of which I was supremely ignorant, including microprocessors, cosmology and medical imaging, among others. Journalists who find themselves baffled at scientific press conferences would do well to follow my example, for the truth is that there will always be a fresh supply of incomprehensibility in physics. Don’t be disappointed!

Gordon Fraser. Gordon, who was editor of CERN Courier for many years, wrote this as a ‘Lateral Thought’ for Physics World magazine but died before the article could be revised (see obituary). It was completed by staff at Physics World and is published in both magazines this month as a tribute.

Marking the end of the first proton run

At 6 a.m. on 17 December, operators ended the LHC’s first three-year-long run for proton physics with a new performance milestone. In the preceding days, the spacing between proton bunches had been successfully halved to the design specification of 25 ns, rather than the 50 ns used until then.

Halving the bunch spacing allowed the number of bunches in the machine to be doubled, resulting in a record number of 2748 bunches in each beam; previously the LHC had been running with around 1380 bunches per beam. This gave a record beam intensity of 2.7 × 10¹⁴ protons in both beams at the injection energy of 450 GeV.
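As a back-of-the-envelope check (assuming the quoted intensity refers to a single beam; a hypothetical snippet, not an official calculation):

    # Implied bunch population for the record fill
    n_bunches = 2748
    protons_per_beam = 2.7e14          # assumption: quoted figure is per beam
    print(f"{protons_per_beam / n_bunches:.2e} protons per bunch")
    # ~9.8e10, close to the nominal bunch population of about 1.15e11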

The LHC operations team then performed a number of ramps taking 25 ns beams from 450 GeV to 4 TeV, increasing the total number of bunches at each step to a maximum of 804 per beam. The stepwise approach is needed to monitor the effects of the additional electron cloud produced when synchrotron radiation emitted by the protons strikes the vacuum chamber – the synchrotron-radiation photon flux increases significantly as the energy of the protons is increased.
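The energy dependence is steep: the critical photon energy of synchrotron radiation grows as the cube of the beam’s Lorentz factor. A rough estimate using the standard formula (the bending radius below is an assumed, approximate value for the LHC arc dipoles):

    # Critical synchrotron-photon energy: eps_c = 3 * hbar*c * gamma^3 / (2 * rho)
    HBAR_C = 1.973e-7    # eV * m
    M_PROTON = 938.3e6   # proton rest energy, eV
    RHO = 2804.0         # bending radius, m (assumed approximate LHC value)

    for e_beam in (450e9, 4e12, 7e12):   # beam energies in eV
        gamma = e_beam / M_PROTON
        eps_c = 3 * HBAR_C * gamma**3 / (2 * RHO)
        print(f"{e_beam / 1e12:.2f} TeV -> critical photon energy ~ {eps_c:.2g} eV")

At 450 GeV the typical photon energy is far below 1 eV, but at 4 TeV it reaches several eV – enough to eject photoelectrons from the beam-pipe wall, which is what seeds the electron cloud.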

Electron cloud is strongly enhanced by the reduced spacing between bunches and is one of the main limitations for 25 ns operation. It has negative effects on the beam (increasing beam size and losses), the cryogenics (in the heat load on the beam pipe) and the vacuum (pressure rise). As a result, a period of beam-pipe conditioning known as “scrubbing” was needed before ramping the beams. During this period, the machine was operated in a controlled way with beams of increasingly high intensity. This helps to improve the surface characteristics of the beam pipe and reduces the density of the electron cloud. Once each beam had been ramped to 4 TeV, a pilot physics run of several hours took place with up to 396 bunches, spaced at 25 ns, in each beam. Although the tests were successful, significantly more scrubbing will be required before the full 25 ns beam can be used operationally.

While these tests were taking place, on 13 December representatives of the LHC and five of its experiments delivered a round-up report to CERN Council. All of the collaborations congratulated the LHC team on the machine’s exemplary performance over the first three full years of running. In 2012, not only did the collision energy increase from 7 TeV to 8 TeV but the instantaneous luminosity reached 7.7 × 10³³ cm⁻² s⁻¹, more than twice the maximum value obtained in 2011 (3.5 × 10³³ cm⁻² s⁻¹). News from the experiments included LHCb’s measurement of the decay of the Bs meson into two muons (see “Bs → μμ seen after being sought for decades” in this issue), ALICE’s detailed studies of the quark–gluon plasma and TOTEM’s insights on the structure of the proton.

ATLAS and CMS gave updates on the Higgs-like particle first announced in July, with each experiment now observing the new particle with a significance close to 7σ, well beyond the 5σ required for a discovery. So far, the particle’s properties seem consistent with those of a Standard Model Higgs boson. The two collaborations are, however, careful to say that further analysis of the data – and a probable combination of both experiments’ data next year – will be required before some key properties of the new particle, such as its spin, can be determined conclusively. The focus of the analysis has now moved from discovery to measurement of the new particle in its individual decay channels.

With December 2012 marking the end of the first LHC proton-physics running period, 2013 sees a four-week run from mid-January to mid-February for proton–lead collisions before the machine goes into a long shutdown for consolidation and maintenance until the end of 2014. Running will resume in 2015 at an increased collision energy of 13 TeV.

CERN becomes UN observer


On 14 December, the UN General Assembly adopted a resolution to allow CERN to participate in the work of the General Assembly and to attend its sessions as an observer. With this new status, the laboratory can promote the essential role of basic science in development.

In a meeting with the UN secretary-general, Ban Ki-moon, on 17 December, CERN’s director-general, Rolf Heuer, pledged that CERN was willing to contribute actively to the UN’s efforts to promote science, in particular UNESCO’s initiative “Science for sustainable development”.

(Photo: Ban Ki-moon, left, with Rolf Heuer.)

BOSS gives clearer view of baryon oscillations


In November the Baryon Oscillation Spectroscopic Survey (BOSS) released its second major result of 2012, using 48,000 quasars with redshifts (z) up to 3.5 as backlights to map intergalactic hydrogen gas in the early universe for the first time, as far back as 11,500 million years ago.

As the light from each quasar passes through clouds of gas on its way to Earth, its spectrum accumulates a thicket of hydrogen absorption lines, the “Lyman-alpha forest”, whose redshifts and prominence reveal the varying density of the gas along the line of sight. BOSS collected enough close-together quasars to map the distribution of the gas in 3D over a wide expanse of sky.
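The bookkeeping behind this is simple redshift arithmetic: absorption at the rest-frame Lyman-alpha wavelength by gas at redshift z appears at 121.6 × (1 + z) nm in the observed spectrum. A minimal sketch (illustrative only, not BOSS code):

    # Where Lyman-alpha absorption by gas at redshift z lands in an observed spectrum
    LYA_REST_NM = 121.567   # rest-frame Lyman-alpha wavelength, nm

    def observed_lya_nm(z):
        """Observed wavelength (nm) of Lyman-alpha absorption at redshift z."""
        return LYA_REST_NM * (1.0 + z)

    for z in (2.0, 2.5, 3.0, 3.5):
        print(f"z = {z}: Lyman-alpha observed at {observed_lya_nm(z):.0f} nm")
    # For a z = 3.5 quasar, absorbers at 2 < z < 3.5 fill roughly 365-547 nm

Each pixel of the forest therefore tags gas at a distinct distance along the line of sight, which is what makes a 3D map possible.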

The largest component of the third Sloan Digital Sky Survey, BOSS measures baryon acoustic oscillations (BAO) – recurring peaks of matter density that are most evident in net-like strands of galaxies. Initially imprinted in the cosmic microwave background radiation, BAO provide a ruler for measuring the universe’s expansion history and probing the nature of dark energy.
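In standard-ruler terms (textbook relations, with r_d the comoving sound horizon at the drag epoch, roughly 150 Mpc), the BAO feature subtends

    \theta_{\mathrm{BAO}}(z) \simeq \frac{r_d}{(1+z)\,D_A(z)}, \qquad \Delta z_{\mathrm{BAO}}(z) \simeq \frac{r_d\,H(z)}{c},

the first measured across the line of sight and the second along it, so observing the feature at several redshifts pins down both the angular-diameter distance D_A(z) and the expansion rate H(z).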

In March 2012, BOSS released its first results on more than 350,000 galaxies up to z = 0.7, or 7000 million years ago. However, only quasars are bright enough to probe the gravity-dominated early universe when expansion was slowing, well before the transition to the present, where dark energy dominates and expansion is accelerating. When complete, BOSS will have surveyed 1.5 million galaxies and 160,000 quasars.

Resolving the nature of dark energy will require even greater precision. The BigBOSS collaboration, which, like BOSS, is led by scientists at Lawrence Berkeley National Laboratory (LBNL), proposes to modify the 4-m Mayall Telescope to survey 24 million galaxies to z = 1.7, plus two million quasars to z = 3.5. The Gordon and Betty Moore Foundation recently awarded a grant of $2.1 million to help fund the spectrograph and corrector optics, two key BigBOSS technologies.

Europe launches consortium for astroparticle physics

At the end of November, European funding agencies for astroparticle physics launched a new sustainable entity, the Astroparticle Physics European Consortium (APPEC). This will build on the successful work of the European-funded network, the AStroParticle European Research Area (ASPERA).

Over the past six years, ASPERA has brought together funding agencies and the physics community to set up European co-ordination for astroparticle physics. It has developed common R&D calls and created closer relationships with industry and other research fields. Above all, ASPERA has developed a European strategy for astroparticle physics to prioritize the large infrastructures needed to solve universal mysteries concerning, for example, neutrinos, gravitational waves, dark matter and dark energy.

APPEC now plans to develop a European common action plan to fund the upcoming large astroparticle-physics infrastructures as defined in ASPERA’s road map. Ten countries have already joined the new APPEC consortium, with nine others following the accession process. APPEC’s activities will be organized through three functional centres, located at DESY, the Astronomy, Particle Physics and Cosmology laboratory of the French CNRS/CEA, and the INFN’s Gran Sasso National Laboratory. Stavros Katsanevas of CNRS has been elected as chair of APPEC and Thomas Berghoefer of DESY as general secretary.

• APPEC is the Astroparticle Physics European Consortium. It currently comprises 10 countries represented by their Ministries, funding agencies or their designated institution: Belgium (FWO), Croatia (HRZZ), France (CEA, CNRS), Germany (DESY), Ireland (RIA), Italy (INFN), The Netherlands (FOM), Poland (NCN), Romania (IFIN) and the UK (STFC).

ATLAS enlists monojets in search for new physics

Events with a single jet of particles in the final state have traditionally been studied in the context of searches for supersymmetry, for large extra spatial dimensions and for candidates for dark matter. Having searched for new phenomena in monojet final states in the 2011 data, the ATLAS collaboration turned its attention to data collected in 2012, with the first results presented at the Hadron Collider Physics (HCP) symposium in Kyoto in November.


Models with large extra spatial dimensions aim to provide a solution to the mass-hierarchy problem (related to the large difference between the electroweak unification scale at around 10² GeV and the Planck scale around 10¹⁹ GeV) by postulating the presence of n extra dimensions, such that the Planck scale in 4+n dimensions becomes naturally close to the electroweak scale. In these models, gravitons (the particles hypothesized as mediators of the gravitational interaction) are produced in association with a jet of hadrons; the extremely weakly interacting gravitons would escape detection, leading to a monojet signature in the final state.
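Concretely, in such (ADD-type) scenarios the observed Planck scale M_Pl is related to the fundamental scale M_* and the common size R of the n compactified dimensions by the standard relation

    M_{\mathrm{Pl}}^2 \sim M_*^{\,n+2} R^{\,n}.

With M_* near 1 TeV, n = 2 gives R of order a millimetre, and larger n gives progressively smaller R – which is why collider searches and short-distance gravity experiments probe complementary corners of these models.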


Dark-matter particles could also give rise to monojet events. According to the current understanding of cosmology, non-baryonic non-luminous matter contributes about 23% of the total mass-energy budget of the universe but the exact nature of this dark matter remains unknown. A commonly accepted hypothesis is that it consists of weakly interacting massive particles (WIMPs) acting through gravitational or weak interactions. At the LHC, WIMPs could be produced in pairs that would pass through the experimental devices undetected. Such events could be identified by the presence of an energetic jet from initial-state radiation, leading again to a monojet signature. The LHC experiments have a unique sensitivity for dark-matter candidates with masses below 4 GeV and are therefore complementary to other searches for dark matter.

The study presented at HCP uses 10 fb⁻¹ of proton–proton data collected during 2012, at a centre-of-mass energy of 8 TeV. As with the earlier analysis, the results are still in good agreement with the predictions of the Standard Model (figure 2). The new results have been translated into updated exclusion limits on the presence of large extra spatial dimensions and the production of WIMPs, as well as new limits on the production of gravitinos (the supersymmetric partners of gravitons) that result in the best lower bound to date on the mass of the gravitino.

Bs → μμ seen after being sought for decades


It has taken decades of hunting but finally the first evidence for one of the rarest particle decays ever seen in nature, the decay of a Bs (composed of a beauty antiquark and a strange quark) into two muons, has been uncovered by the LHCb collaboration.

In the Standard Model, the decay Bs → μμ is calculated to occur only three times in every 1000 million Bs decays. While the Standard Model has been incredibly successful, it leaves many unanswered questions concerning, for example, the origin of the matter–antimatter asymmetry and the essence of dark matter. Extended theories, such as supersymmetry, may resolve some of these issues. These theories allow for new particles and phenomena that can affect measurable quantities. The branching fraction B(Bs → μμ), for example, can be enhanced or reduced with respect to the Standard Model prediction, so the measurement has the potential to reveal hints of new physics. The LHCb experiment is particularly suited for such an indirect search for the effects of new physics, complementary to direct searches for new particles.

The LHCb collaboration performed the search for Bs → μμ (and B0 → μμ) by analysing 1.0 fb⁻¹ of proton–proton collisions at 7 TeV in the centre of mass (from 2011) and 1.1 fb⁻¹ at 8 TeV (2012). The signal selection starts with the search for pairs of oppositely charged muons that make a vertex that is displaced from the proton–proton interaction vertex (see figure 1). The signal and background are then separated using simultaneously the invariant mass of the two muons as well as kinematic and topological information combined in a multivariate-analysis classifier. The particular classifier used is a boosted decision-tree (BDT) algorithm, which is calibrated with data for both signal and background events. The latter are dominated by random combinations of two muons from two different B mesons; this contribution is carefully determined from data.
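For readers unfamiliar with such classifiers, here is a generic sketch of a boosted decision tree separating toy “signal” and “background” samples (scikit-learn, with made-up input variables; this is not the LHCb analysis code, whose inputs are carefully calibrated kinematic and topological quantities):

    import numpy as np
    from sklearn.ensemble import GradientBoostingClassifier

    rng = np.random.default_rng(0)

    # Toy stand-ins for discriminating variables (e.g. vertex displacement,
    # muon isolation); signal and background overlap but peak in different places.
    n = 5000
    signal = rng.normal(loc=[2.0, 1.0], scale=1.0, size=(n, 2))
    background = rng.normal(loc=[0.0, 0.0], scale=1.0, size=(n, 2))

    X = np.vstack([signal, background])
    y = np.concatenate([np.ones(n), np.zeros(n)])   # 1 = signal, 0 = background

    bdt = GradientBoostingClassifier(n_estimators=100, max_depth=3)
    bdt.fit(X, y)

    # Each candidate then receives a continuous signal score, used together
    # with the dimuon invariant mass to separate signal from background.
    print(bdt.predict_proba(X[:3])[:, 1])

In the real analysis the classifier response is calibrated on control samples in data, as noted above.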

(Figure 2: invariant-mass distribution.)

The number of B0 → μμ candidates that LHCb observes is consistent with the background expectation, giving an upper limit of B(B0 → μμ) < 9.4 × 10⁻¹⁰ at 95% confidence level. This is the world’s most stringent upper limit from a single experiment on this branching fraction. However, for Bs → μμ, LHCb sees an excess of candidates with respect to the background expectation (figure 2). A maximum-likelihood fit gives a branching fraction of B(Bs → μμ) = 3.2 (+1.5/−1.2) × 10⁻⁹. The probability that the background could produce an excess of this size or larger is 5.3 × 10⁻⁴, corresponding to a signal significance of 3.5σ.

The measurement of Bs → μμ is close to the Standard Model prediction, albeit with a large uncertainty. This eagerly awaited result was presented at the Hadron Collider Physics Symposium in Kyoto and at a CERN seminar, and is now published. While it does not provide evidence for supersymmetry, it does constrain the parameter space for this and other models of new physics, and is a step further in understanding the universe.
