By Mohammad Saleem and Muhammad Rafique CRC Press/Taylor and Francis
Hardback: £44.99
Although group theory has played a significant role in the development of various disciplines of physics, there are few recent books that start from the beginning and then go on to consider applications from the point of view of high-energy physicists. Group Theory for High-Energy Physicists aims to fill that gap. The book first introduces the concept of a group and the characteristics that are imperative for developing group theory as applied to high-energy physics. It then describes group representations and, with a focus on continuous groups, analyses the root structure of important groups and obtains the weights of various representations of these groups. It also explains how symmetry principles associated with group theoretical techniques can be used to interpret experimental results and make predictions. This concise introduction should be accessible to undergraduate and graduate students in physics and mathematics, as well as to researchers in high-energy physics.
By Chun Wa Wong Oxford University Press
Hardback: £45 $84.95
Introduction to Mathematical Physics explains how and why mathematics is needed in the description of physical events in space. Aimed at physics undergraduates, it is a classroom-tested textbook on vector analysis, linear operators, Fourier series and integrals, differential equations, special functions and functions of a complex variable. Strongly correlated with core undergraduate courses on classical and quantum mechanics and electromagnetism, it helps students master these necessary mathematical skills but also contains advanced topics of interest to graduate students. It includes many tables of mathematical formulae and references to useful materials on the internet, as well as short tutorials on basic mathematical topics to help readers refresh their knowledge. An appendix on Mathematica encourages the reader to use computer-aided algebra to solve problems in mathematical physics. A free Instructor’s Solutions Manual is available to instructors who order the book.
By Milutin Blagojević and Friedrich W Hehl (eds.) World Scientific
Hardback: £111 $168 S$222
With a foreword by Tom Kibble and commentaries by Milutin Blagojević and Friedrich W Hehl, the aim of this volume is to introduce graduate and advanced undergraduate students of theoretical or mathematical physics – and other interested researchers – to the field of classical gauge theories of gravity. Intended as a guide to the literature in this field, it encourages readers to study the introductory commentaries and become familiar with the basic content of the reprints and the related ideas, before choosing specific reprints and then returning to the text to focus on further topics.
By Ian D Lawrie CRC Press/Taylor and Francis
Paperback: £44.99
A Unified Grand Tour of Theoretical Physics invites readers on a guided exploration of the theoretical ideas that shape contemporary understanding of the physical world at the fundamental level. Its central themes – which include space–time geometry and the general relativistic account of gravity, quantum field theory and the gauge theories of fundamental forces – are developed in explicit mathematical detail, with an emphasis on conceptual understanding. Straightforward treatments of the Standard Model of particle physics and that of cosmology are supplemented with introductory accounts of more speculative theories, including supersymmetry and string theory. This third edition includes a new chapter on quantum gravity and new sections with extended discussions of topics such as the Higgs boson, massive neutrinos, cosmological perturbations, dark energy and dark matter.
By Anton Rebhan, Ludmil Katzarkov, Johanna Knapp, Radoslav Rashkov and Emanuel Scheidegger (eds.) World Scientific
Hardback: £104
E-book: £135
This book contains invited contributions from collaborators of Maximilian Kreuzer, a well-known string theorist who built a sizeable group at the Vienna University of Technology (TU Vienna) but sadly died in November 2010, aged just 50. Victor Batyrev, Philip Candelas, Michael Douglas, Alexei Morozov, Joseph Polchinski, Peter van Nieuwenhuizen and Peter West are among those contributing accounts of Kreuzer’s scientific legacy, as well as original articles. Besides reviews of recent progress in the exploration of string-theory vacua and corresponding mathematical developments, Part I reviews in detail Kreuzer’s important work with Friedemann Brandt and Norbert Dragon on the classification of anomalies in gauge theories. Similarly, Part III contains a user manual for a new, thoroughly revised version of PALP (Package for Analysing Lattice Polytopes with applications to toric geometry), the software developed by Kreuzer and Harald Skarke at TU Vienna.
By Alexander W Chao and Weiren Chou (eds.) World Scientific
Hardback: £111
E-book: £144
Of the roughly 30,000 accelerators at work in the world today, the majority are used in industry. This volume of Reviews of Accelerator Science and Technology contains 14 articles on such applications, all by experts in their respective fields. The first eight articles review various applications, from ion-beam analysis to neutron generation, while the next three discuss accelerator technology that has been developed specifically for industry. The twelfth article tackles the challenging subject of future prospects in this rapidly evolving branch of technology. Last, the volume features an article on the success story of CERN by former director-general, Herwig Schopper, as well as a tribute to Simon van der Meer, “A modest genius of accelerator science”.
The Compact Linear Collider (CLIC) and the International Linear Collider (ILC) – two studies for next-generation projects to complement the LHC – now belong to the same organization. The Linear Collider Collaboration (LCC) was officially launched on 21 February at TRIUMF, Canada’s national laboratory for particle and nuclear physics.
The ILC and CLIC have similar physics goals but use different technologies and are at different stages of readiness. The teams working on them have now united in the new organization to make the best use of the synergies between the two projects and to co-ordinate and advance the global development work for a future linear collider. Lyn Evans, former project leader of the LHC, heads the LCC, while Hitoshi Murayama, director of the Kavli Institute for the Physics and Mathematics of the Universe, is deputy-director.
The LCC has three main sections, reflecting the three areas of research that will continue to be conducted. Mike Harrison of Brookhaven National Laboratory leads the ILC section, Steinar Stapnes of CERN leads the CLIC section and Hitoshi Yamamoto of Tohoku University leads the section for physics and detectors. The Linear Collider Board (LCB), with the University of Tokyo’s Sachio Komamiya at its head, is a new oversight committee for the LCC. Appointed by the International Committee for Future Accelerators, the LCB met for the first time at TRIUMF in February. The ILC’s Global Design Effort and its supervisory organization, the ILC Steering Committee, officially handed over their duties to the LCC and LCB in February but they will continue to work together until the official completion of the Technical Design Report for the ILC.
Both the ILC and CLIC will continue to exist and carry on their R&D activities – but with even more synergy between common areas. These include the detectors and the planning of infrastructure, as well as civil-engineering and accelerator aspects. The projects are at different stages of maturity. The CLIC collaboration published its Conceptual Design Report in 2012 and is scheduled to complete the Technical Design Report, which demonstrates feasibility for construction, in a couple of years.
For the ILC collaboration, which will publish its Technical Design Report in June this year, the main focus is on preparing for possible construction while at the same time further advancing acceleration technologies, industrialization and design optimization. The final version of the report will include a new figure for the projected cost. The current estimate is 7.8 thousand million ILC Units (1 ILC unit is equivalent to US$1 of January 2012), plus an explicit estimate for labour costs averaged over the three regional sites, amounting to 23 million person-hours. With the finalization of the Technical Design Report, the ILC’s Global Design Effort, led by Barry Barish, will formally complete its mandate.
With the discovery of the Higgs-like boson at the LHC, the case for a next-generation collider in the near future has received a boost and researchers are thinking of ways to build the linear collider in stages: first as a so-called Higgs factory for precision studies of the new particle; second at an energy of 500 GeV; and third, at double this energy, to open further possibilities for as yet undiscovered physics phenomena. Japan has signalled interest in hosting the ILC.
“Now that the LHC has delivered its first and exciting discovery, I am eager to help the next project on its way,” says Evans. “With the strong support the ILC receives from Japan, the LCC may be getting the tunnelling machines out soon for a Higgs factory in Japan while at the same time pushing frontiers in CLIC technology.”
When the LHC and injector beams stopped on 16 February, the following words appeared on LHC Page 1: “No beam for a while. Access required: Time estimate ~2 years”. This message marked the start of the first long shutdown (LS1). Over the coming years, major maintenance work will be carried out across the whole of CERN’s accelerator chain. Among the many tasks foreseen, more than 10,000 LHC magnet interconnections will be consolidated and the entire ventilation system for the 628-m-circumference Proton Synchrotron will be replaced, as will more than 100 km of cables on the Super Proton Synchrotron. The LHC is scheduled to start up again in 2015, operating at its design energy of 7 TeV per beam, with the rest of the CERN complex restarting in the second half of 2014.
The LHC’s first dedicated proton–lead run came to an end on 10 February, having delivered an integrated luminosity of more than 30 nb⁻¹ to ALICE, ATLAS and CMS and 2.1 nb⁻¹ to LHCb, with the TOTEM, ALFA and LHCf experiments also taking data. This run ended later than planned because of challenges in switching the directions of the two beams; as a result, the 2013 operations were extended slightly to allow four days of proton–proton collisions at 1.38 TeV. To save time, these collisions were performed unsqueezed. After set-up, four fills with around 1300 bunches and a peak luminosity of 1.5 × 10³² cm⁻² s⁻¹ delivered around 5 pb⁻¹ of data to ATLAS and CMS. The requisite luminosity scans were somewhat hampered by technical issues but succeeded in the end, leaving just enough time for a fast turnaround and a short final run at 1.38 TeV for ALFA and TOTEM.
On 14 February, the shift crew dumped the beams from the LHC to bring to an end the machine’s first three-year physics run. Two days of quench tests followed immediately to establish the beam loss required to quench the magnets. Thanks to these tests, it will be possible to set optimum thresholds on the beam-loss monitors when beams circulate again in 2015.
Despite no beam from 16 February onwards, the LHC stayed cold until 4 March so that powering tests could verify the proper functioning of the LHC’s main magnet (dipole and quadrupole) circuits. At the same time, teams in the CERN Control Centre performed extensive tests of all of the other circuits, up to current levels corresponding to operation with 7 TeV beams. By powering the entire machine and then going sector by sector, the operators managed to perform more than a thousand tests on 540 circuits in just 10 days. Small issues were resolved by immediate interventions and the operators identified a number of circuits that need a more detailed analysis and possibly intervention during LS1.
With powering tests complete, the Electrical Quality Assurance team could test the electrical insulation of each magnet, sector by sector, before the helium was removed and stored. Beginning with sector 5–6, the magnets are now being warmed up carefully and the entire machine should be at room temperature by the end of May.
On the same day that the LHC’s first three-year physics run ended, CERN announced that its data centre had recorded more than 100 petabytes (PB) – 100 million gigabytes – of physics data.
Amassed over the past 20 years, this 100 PB – the equivalent of 700 years of full HD-quality video – has been a challenge to store. At CERN, the bulk of the data (about 88 PB) is archived on tape using the CERN Advanced Storage (CASTOR) system. The rest (13 PB) is stored on the EOS disk-pool system, which is optimized for fast analysis access by many concurrent users.
For the CASTOR system, eight robotic tape libraries are distributed across two buildings, with each tape library capable of containing up to 14,000 tape cartridges. CERN currently has around 52,000 tape cartridges with a capacity ranging from 1 terabyte (TB) to 5.5 TB each. For the EOS system, the data are stored on more than 17,000 disks attached to 800 disk servers.
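A quick back-of-the-envelope check of the tape figures quoted above (the numbers and the 1 PB = 1000 TB conversion are the only assumptions) shows that the 88 PB archive sits comfortably within the capacity of the mixed cartridge fleet:

```python
# Sanity check of the CASTOR tape figures quoted in the text:
# 52,000 cartridges of 1-5.5 TB each, holding an 88 PB archive.
N_CARTRIDGES = 52_000
MIN_TB, MAX_TB = 1.0, 5.5       # per-cartridge capacity range
ARCHIVED_PB = 88                # data currently held on tape

# Total capacity if every cartridge were at the low/high end (1 PB = 1000 TB)
min_capacity_pb = N_CARTRIDGES * MIN_TB / 1000
max_capacity_pb = N_CARTRIDGES * MAX_TB / 1000

print(f"Total tape capacity: {min_capacity_pb:.0f}-{max_capacity_pb:.0f} PB")
print(f"Archive ({ARCHIVED_PB} PB) sits within that range: "
      f"{min_capacity_pb <= ARCHIVED_PB <= max_capacity_pb}")
```

The spread (52–286 PB) reflects the mix of cartridge generations; regular migration to the newest high-capacity tapes, described below, narrows the gap between installed and usable capacity.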
Not all of the data are generated by LHC experiments. CERN’s IT Department hosts data from many other high-energy physics experiments at CERN, past and present, and is also a data centre for the Alpha Magnetic Spectrometer.
For both tape and disk, efficient data storage and access must be provided, and this involves identifying performance bottlenecks and understanding how users want to access the data. Tapes are checked regularly to make sure that they stay in good condition and are accessible to users. To optimize storage space, the complete archive is regularly migrated to the newest high-capacity tapes. Disk-based systems are replicated automatically after hard-disk failures and a scalable namespace enables fast concurrent access to millions of individual files.
The data centre will keep busy during the long shutdown of the whole accelerator complex, analysing data taken during the LHC’s first three-year run and preparing for the higher expected data flow when the accelerators and experiments start up again. An extension of the centre and the use of a remote data centre in Hungary will further increase the data centre’s capacity.
Using two independent analyses, the LHCb collaboration has updated its measurement of ΔACP, the difference in CP violation between the decays D⁰ → K⁺K⁻ and D⁰ → π⁺π⁻. This helps to cast light on whether – and to what extent – CP violation occurs in interactions involving particles, such as the D⁰, that contain a charm quark.
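For reference, ΔACP follows the standard definition used in these analyses: the difference between the time-integrated CP asymmetries of the two decay modes,

```latex
\Delta A_{CP} = A_{CP}(K^+K^-) - A_{CP}(\pi^+\pi^-),
\qquad
A_{CP}(f) = \frac{\Gamma(D^0 \to f) - \Gamma(\bar{D}^0 \to f)}
                 {\Gamma(D^0 \to f) + \Gamma(\bar{D}^0 \to f)}.
```

Taking the difference of the two modes cancels most production and detection asymmetries, which is what makes ΔACP such a clean probe of the charm sector.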
The new results represent a significant improvement in the measurement of ΔACP, which has emerged as an important means to probe the charm sector. A previous measurement from LHCb was 3.5σ from zero and constituted the first evidence for CP violation in the charm sector (LHCb 2012). Subsequent results from the CDF and Belle collaborations, at Fermilab and KEK, respectively, further strengthened the evidence but not to the 5σ gold standard. Because the size of the effect was larger than expected, the result provoked a flurry of theoretical activity, including new physics models that could enhance such asymmetries and ideas for measurements that could elucidate the origin of the effect.
Both of the new measurements by LHCb use the full 2011 data set, corresponding to an integrated luminosity of 1.0 fb⁻¹ of proton–proton collisions at 7 TeV in the centre of mass. The first uses the same “tagging” technique as all previous measurements, in which the initial flavour of the D meson (D⁰ or D̄⁰) is inferred from the charge of the pion in the decay D*⁺ → D⁰π⁺. The second uses D mesons produced in semimuonic B decays, where the charge of the associated muon provides the tag. The two methods allow for useful cross-checks, in particular for biases that have different origins in the two analyses.
Compared with LHCb’s previous publication on ΔACP, the new pion-tagged analysis uses more data, fully reprocessed with improved alignment and calibration constants (LHCb 2013a). The most important change in the analysis procedure is the application of a vertex constraint, which improves background suppression by a factor of 2.5. The result, ΔACP = (–0.34 ± 0.15 (stat.) ± 0.10 (syst.))%, is closer to zero than the previous measurement, which it supersedes. Detailed investigations reveal that the shift caused by each change in the analysis is consistent with a statistical fluctuation.
To add to the picture, the muon-tagged analysis also measures a value that is consistent with zero: ΔACP = (+0.49 ± 0.30 (stat.) ± 0.14 (syst.))% (LHCb 2013b). In both analyses, the control of systematic uncertainties at the per-mille level is substantiated by numerous cross-checks. As the figure shows, the two new results are consistent with each other and with other results at the 2σ level but do not confirm the previous evidence of CP violation in the charm sector.
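The “consistent with zero” statement can be checked from the quoted numbers alone, adding statistical and systematic uncertainties in quadrature (assuming they are independent):

```latex
\sigma_{\text{tot}} = \sqrt{\sigma_{\text{stat}}^2 + \sigma_{\text{syst}}^2},
\qquad
\begin{aligned}
\text{pion-tagged:} &\quad \sqrt{0.15^2 + 0.10^2}\% \approx 0.18\%,
  & |{-0.34}|/0.18 \approx 1.9\sigma, \\
\text{muon-tagged:} &\quad \sqrt{0.30^2 + 0.14^2}\% \approx 0.33\%,
  & 0.49/0.33 \approx 1.5\sigma.
\end{aligned}
```

Both central values thus lie below 2σ from zero, well short of the earlier 3.5σ evidence.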
Theoretical work has shown that several well-motivated models could induce large CP-violation effects in the charm sector. These new results constrain the parameter space of such models. Further updates to this and related measurements will be needed to discover whether – and at what level – nature distinguishes between charm and anticharm. The full data sample recorded by LHCb up to the start of the long shutdown contains more than three times the number of charm decays used here, so progress can be anticipated during the shutdown.