
Carlo Rubbia: a passion for physics and a craving for new ideas

With CERN as his scientific home since 1961, Carlo Rubbia is unique in the organization. One of the three Nobel laureates who received their prizes for research done at the laboratory, he was also director-general from 1989 to 1993 – during the crucial years when the ground for the future LHC was prepared. Rubbia’s fame, both at CERN and worldwide, is related closely to his work in the early 1980s, when the conversion of the Super Proton Synchrotron (SPS) to a proton–antiproton collider led to the discovery of the W and Z bosons. The Nobel Prize in Physics was awarded in 1984 jointly to Rubbia and Simon van der Meer “for their decisive contributions to the large project, which led to the discovery of the W and Z field particles, communicators of weak interaction”.

However, Rubbia is much more than an exceptional CERN physicist awarded a Nobel prize, who also became director-general. As a child he was forced to flee his home in Gorizia in north-eastern Italy during the Second World War. He went on to become a brilliant physics student at Scuola Normale in Pisa, after almost taking up engineering; a scientist who continues to push the frontiers of knowledge; a volcano of ideas whose inextinguishable fuel is his relentless curiosity and vision; and a courageous citizen of the world, convinced of the duty of science to find solutions to today’s global emergencies. In August 2013, the president of the Italian Republic, Giorgio Napolitano, recognized Rubbia’s contribution to the history and prestige of CERN, and to a field “vital to our country”, as he put it, when he appointed Rubbia “senator for life” (CERN Courier November 2013 p37).

The path that would take Carlo Rubbia to Stockholm for the Nobel prize started at the Scuola Normale in Pisa in September 1953, just one year before CERN was founded. There he chose physics against the will of his parents.
My family would have preferred that I took engineering, but I wanted to study physics. So we agreed that if I passed the entrance exams for the Scuola Normale in Pisa, I could study physics there, otherwise I would have to do engineering. There were only 10 places for Pisa, and I was ranked 11th, so I lost – and I started engineering at Milan. Luckily an unknown student among the first 10 in Pisa (whom I’d be curious to meet one day) gave up and left a place open to the next applicant on the waiting list. So, three months later, I was in Pisa, studying physics, and I stayed there and had a lot of fun.

It’s not unusual to hear research physicists speak about their job in terms of “fun”. In Rubbia’s case, physics is still a huge part of his life and is a real passion. Why is this?
“New” is the keyword. Discovering something new creates alternatives, generates interest and fuels the world. The future is not repeating what you’ve done in the past. Innovation, driven by curiosity, the desire to find out something new, is one of the fundamental attributes of mankind. We did not go to the Moon because of wisdom, but because of curiosity. This is part of human instinct, it is true for all civilizations, and is unavoidable.

After obtaining his degree from Pisa in 1957 in record time – three years, including the doctoral thesis – Rubbia spent a year on research at Columbia University in the US, followed by a year at La Sapienza university in Rome as assistant professor to Marcello Conversi – whom he remembers as “a great friend in addition to being a great mentor, someone with whom the transition from student to colleague happened very smoothly” – before landing at CERN in 1961.
I thought Europe was the real future for someone who wanted to do research. Europe needed a resurgence in science in general, and physics in particular. [After the US] my interest was not in going back to Italy, but rather to go back to Europe. So I left Rome after a year of teaching and went to CERN.

When Rubbia arrived, CERN’s active laboratory life was just beginning.
When I arrived, none of the buildings you see today were there. There was no cafeteria. We took coffee and our meals at a restaurant near the airport [Le Café de l’Aviation] and the University of Geneva offered us space to carry out our work while CERN was under construction. In this group of CERN pioneers there were also two others who received the Nobel prize: Simon van der Meer and Georges Charpak. All three of us were among the few who have experienced CERN since its early days.

CERN was the stage for Rubbia’s greatest scientific adventure, which led to the discovery of the W and Z bosons. He convinced the director-general at the time – John Adams – to modify the programme of the SPS and transform it into a proton–antiproton collider, using technology that had yet to be developed.
My first proposal was written for the US, because I was teaching at Harvard and therefore part of the US system – the most advanced research system in the world at the time. But this did not work for a number of reasons, among them the bureaucracy that was starting to grow there. CERN was a new place, there were people like Léon Van Hove and John Adams who had a vision for the future, and they supported my idea, which soon became a possible solution. Clearly this kind of idea involves a lot of pressure, hard work, and competition with alternative ideas. There were many competing projects, all aiming to become the new big project for CERN, for which there were funds. Bjørn Wiik wanted to make an electron–proton machine, Pierre Darriulat was pushing for a super-ISR, a superconducting one. All of these ideas were part of a purely scientific debate, without any political influence, and this was very healthy.

Making collisions between two beams, especially between protons and antiprotons, required enormous development, but not that many people. The number of people who developed the proton–antiproton collider – I mean those who made a real intellectual contribution, those who did 99% of the work – was no more than a dozen. We were looking for an answer to a very specific question and we had a very clear idea of what we were looking for.

I left an LHC minus the SSC to the CERN community

Rubbia’s next big challenge after the discovery of the W and Z bosons was his mandate as CERN’s director-general, from January 1989 to December 1993, at a crucial time for setting up CERN’s next big project – the LHC.
The name LHC was invented by us – by myself and a small group of people around me. I remember Giorgio Brianti saying that the acronym LHC could not be used, because it already meant Lausanne Hockey Club, which was, at the time, much more popular for the lay public than a machine colliding high-energy protons. Nowadays things are quite different! We started with a programme that was much less ambitious than the US programme. The Americans were still somehow “cut to the quick” by our proton–antiproton programme, so they had started the SSC project – the Superconducting Super Collider – which would be a huge machine, a much more expensive one, but which was later abandoned. So, when my mandate as director-general finished, I left an LHC minus the SSC to the CERN community.

On 4 July 2012, when CERN announced to the world the discovery of “a new boson”, 30 years after his discovery of the W and Z bosons, Rubbia’s reaction – from the press conference held at the annual Nobel gathering in Lindau that he was attending – was as enthusiastic as ever.
“This result is remarkable, no question. To get the line at 125 GeV – or 150 protons in terms of mass – which is extremely narrow, has a width of less than 1%, and comes out exactly with a large latitude of precision, with two independent experiments that have done the measurements separately and found the same mass and the same very, very narrow width…well, it’s a fantastic experimental result! It doesn’t happen every day. The last time, as far as I know, was when we discovered the W and Z at CERN and the top at Fermilab. We are in front of a major milestone in the understanding of the basic forces of nature.”

The basic forces of nature are not the only fodder to feed Rubbia’s inextinguishable curiosity and craving for innovation. After his mandate as CERN’s director-general finished at the end of 1993, he fought to bring accelerator technology to a variety of fields, from the production of “sustainable” nuclear energy to the production of new radioisotopes for medicine, from a new engine to shorten interplanetary journeys to innovative solar-energy sources.
“Homo” is essentially “faber” – born to build, to make. Today there are many things that need development and innovation. One of the most urgent problems we have is that the population on our planet is growing too fast. Since I was born, the number of people on Earth has multiplied by a factor of three, but the energy used has grown as the square of the number of people, because each of us consumes more energy. We know today that the primary energy produced is 10 times the quantity produced when I was born – and the planet is paying a price. So I find it normal to wonder, where are we going in the future? Will the children born today have 10 times the energy produced today? Will we have three times the population of today? This is the famous reasoning started by Aurelio Peccei, founder of the Club of Rome – the well-known “limits to growth”, discussed in Italy at least a quarter of a century ago. This is still an important issue and it’s all about energy. And this opens up the question of nuclear energy – the old one versus the new one.

Clearly, nuclear energy has gone through a lot of development, but still the nuclear energy that we have today is fundamentally the same as yesterday’s, based on the ideas brought about by Enrico Fermi in the 1940s. It’s part of the era of the Cold War, of development projects for nuclear energy as a weapon rather than basic research. Today the stakes have changed. So if we want to use nuclei to make energy, which we should, we have to do it on a different basis, with elements and conditions that are fundamentally different from yesterday’s. Three aspects of yesterday’s/today’s nuclear energy are worrying: Hiroshima, Chernobyl, and, more recently, Fukushima is also now part of the family of disasters. And of course there’s the problem of radioactive waste. These aspects are no longer manageable in the same way that they were during the Cold War’s golden era. We obviously have to change. And it’s the scientist’s task to improve things. Planes in the 1940s and 1950s scared everybody. My father never boarded a plane. Today everyone does. Why? Because we accepted and modified the technology to make planes super safe. We have to make nuclear energy super safe.

How does someone who has witnessed the entire history of CERN – often first-hand – see its future, and the future of physics?
The LHC brought an enormous change to CERN, whereby today the collaboration with the rest of the world, with non-European countries like the US and Japan, is a co-operation rather than a competition. The LHC transformed CERN from a European laboratory into the main laboratory for an entire research field across the whole world. But this is not without disadvantages, because competition has its benefits. Having a single world-laboratory doing a specific thing is a big risk. If there is only one way of doing things, there is no alternative, unless alternatives come from the inside. But alternatives coming from the inside have a difficult life because the feeling of continuity prevails over innovation. Fortunately, we have an experimental programme and all the elements are now there to conduct high-precision research, and we are on the verge of turning a new page.

I do not know what the next page will be and I would prefer to let nature decide what we physicists will find next. But one thing is clear: with 96% of the universe still to be fathomed, we are faced with an absolutely extraordinary situation, and I wonder whether a young person who wants to study physics today, and is told that 96% of the mass and energy of the universe is yet to be understood, feels excited. Obviously they should feel as excited as I did when I was told about elementary particles. Innovative knowledge, the surprise effect, still exists today and is very strong, provided there are people capable of perceiving it.

CERN will have to choose a new director-general soon. If you had a chance to take that position again, what would your policy for the laboratory be?
I always said that physics at CERN has to be “broad band”. It cannot be “narrow band”. Transforming the SPS into a proton–antiproton collider and cooling antiprotons were not part of the programme, and we had the flexibility and freedom to do it. We built the LHC while LEP was still functioning – that was a broad-band scientific policy. The problem is, you never know where the next discovery will come from! Our field is made of surprises, and only a broad-band physics programme can guarantee the future of CERN.

CERN and ITER cooperate

In November 2006, the last LHC dipole and quadrupole cold masses arrived at CERN, signalling the end of the industrial construction of the major components of the new 27-km particle collider (CERN Courier October 2006 p28 and January/February 2007 p25). The LHC then entered the installation and commissioning phases. In the same month, at the Elysée Palace in Paris, the ITER Agreement was signed by seven parties: China, the EU, India, Japan, Korea, Russia and the US. The agreement’s ratification in October the following year marked the start of a new mega-science project – ITER, standing originally for the International Thermonuclear Experimental Reactor – that in many respects is the heir of the LHC. Both machines are based on, for example, a huge superconducting magnet system, large cryogenic plants of unmatched power, a large volume of ultra-high vacuum, a complex electrical powering system, sophisticated interlock and protection systems, high-technology devices and work in highly radioactive environments.

The two projects share many technologies and operating conditions and are both based on large international collaborations. These elements constitute the basis for a natural collaboration between the two projects, despite there being distinct differences between their managerial and sociological models.

In the years 2007–2012, CERN could not engage in new large projects, not only because effort was focussed on installation and commissioning of the LHC – and repair and consolidation (CERN Courier April 2009 p6) – but also because of budgetary constraints set by the repayment of loans for its construction. Many groups and departments at CERN faced a related reduction of personnel. In contrast, the new ITER organization had to be staffed and become immediately operational to organize the procurement arrangements between ITER and the domestic agencies acting for the seven members. Indeed, some new staff members were recruited from laboratories that had just finished their engagement with the LHC, such as the CEA in France and CERN itself. However, the number of staff was not sufficient to satisfy ITER’s needs, a problem compounded by the need to train some of them. For example, the ITER magnet system – perhaps the largest technical challenge of the whole project – required many further detailed studies before the design could be brought to sufficient maturity to allow hand-over to the domestic agencies for construction. The ITER magnet management was also interested in benefitting from the technical skills and project-management experience for large-scale procurement from industry that CERN had accumulated during construction of the LHC.

Beyond the primary reasons for collaboration between CERN and ITER, there were further reasons that made it attractive to both parties. For CERN there was the possibility of conducting R&D and studies in key domains, despite the lack of new projects and internal funding. Examples include:

  • the superconductor reference laboratory, set up for the ITER organization, which has proved to be useful for CERN’s internal programme, formally launched in 2011, for the new High Luminosity LHC;
  • qualification of new commercial nuclear-radiation-hard optical fibres, with measurements also at cryogenic temperatures;
  • design of high-temperature superconductor (HTS) 70-kA-class current leads, with sophisticated 3D simulations and experimental mock-ups;
  • setting up a unique high-voltage laboratory for cryo-testing insulation and instrumentation equipment;
  • new concepts and controllers for the HTS current leads and magnet protection units; and
  • activities in metallurgy, welding and material testing, which have helped to increase CERN’s already world-renowned competence in this domain.

The list could be longer. Only a minor part of the activity was supplying a “service” or the transfer of knowledge. In many cases the activity was new design, new R&D or validation of beyond-state-of-the-art concepts.

For ITER, the benefit lay not only in receiving the services and studies, for which it paid. It was also in having access to a large spectrum of competence in a single organization. CERN could react promptly to the demands and needs stemming from contracts and unexpected difficulties in the multiparty complex system set up for ITER construction.

Discussions between CERN and ITER management started in 2007 and were formalized with a framework co-operation agreement signed at CERN by the directors-general of the two organizations on 6 March 2008. This agreement foresaw a co-ordination committee that was in fact not set up until 2012, and has met only twice so far, because the collaboration is working so smoothly that no issues have been raised. The collaboration was then implemented through contracts, called implementing agreements (IAs), under the co-operation agreement. Each IA details the specific content, goals, deliverables, duration and resources.

Table 1 lists the 18 IAs signed so far between CERN and ITER. Each year from 2008, an IA was signed according to the needs of ITER and the possibilities and interest at CERN. Standard IAs – the annual IAs – span one calendar year and contain a variety of different tasks. However, IAs with extended durations of up to five years soon became necessary to secure long-term service by giving CERN the possibility of hiring new personnel in excess of those allowed by the internal budget. In total, CERN has had eight annual contracts so far, one short-term contract (IA12) and nine multiyear contracts, two of them lasting five years – one for operation of the superconductor reference laboratory (IA4) and one for metallurgy and material testing for the magnet system (IA14).

As already mentioned, the co-ordination committee was not set up until 2012, so the various agreements were overseen by a steering committee – later renamed the technical committee to distinguish it better from the co-ordination committee – which is composed of two members per party. The membership of these committees has been relatively constant, and this continuity in management, with smooth changes, is probably one of the reasons for the success of the collaboration. Also, some IAs started outside the usual entry points and were later adjusted to report inside the framework. The CERN–ITER collaboration is a textbook illustration that managing relations between complex organizations, each at the centre of its own network of institutes, is an endless job.

The steering (later technical) committee meets twice a year and each session is prepared carefully. The committee members review the technical work and resolve resource problems by reshuffling tasks or setting up amendments to the IAs – which has happened only five times, never for extra costs but only to adjust the execution of the work to needs. Long-term planning of work and of future agreements is done in the meetings for the best use of resources, and problems are tackled at their outset. So far, no disputes, even minor ones, have occurred.

As in any sustainable collaboration, there are deep discussions on the allocation of resources, most of which are personnel-related, with only a minor part concerning consumables. Figure 1 shows the budget that was allocated for the execution of the agreement. The total of more than CHF 14 million committed corresponds to approximately 80–90 full-time-equivalent years used by CERN to fulfil the agreement. Most personnel are CERN staff, in some cases recruited ad hoc, but fellows and associated personnel are also involved.

The examples in figure 2 show a few of the most important technical achievements. One of the key ingredients of the success of the CERN–ITER collaboration is that checks are done on deliverables, rather than on detailed accounting or time-sheet reporting. This has been possible because of the technical competence of both the management and the technical leaders of the various tasks, as well as of the personnel involved, on both sides. Goals and deliverables, even the most difficult ones, were evaluated correctly and reasonable resources allocated at the outset, with a fair balance and good appreciation of margins. This leads to the conclusion that – despite modern management guidelines – technical competence is not a nuisance: it can make the difference.

Thoughts on CERN’s future

On its 60th birthday, CERN should, first of all, be justly praised for its scientific results – results that stem from the organization’s unique model, which has allowed the modernization of fragmented, hierarchical and largely closed scientific and academic systems in Europe. In my view, CERN’s indirect influence on the evolution of research institutions and universities has not been sufficiently recognized yet.

As a European intergovernmental scientific organization, CERN has proved a robust and sustainable model. Its institutional framework has inspired other successful international collaboration endeavours in science and technology in Europe. CERN owes part of this success to its special way of being European, therefore becoming a model for a certain humanist idea of Europe. The organization has evolved legally to allow for wider participation, and has “implicitly” created new entities that operate as “world organizations”, such as ATLAS and CMS. In the long run, this fruitful evolution could require visionary choices at a global level: CERN must remain firmly European to be globally attractive.


CERN’s intergovernmental nature has also provided a progressive balance of collective ambition and self-interest. It allows small and large nations, including non-member states, to contribute according to their specificities and size, and to extend their contributions by investing in experiments and in technological R&D.

Building on the recognition of all of these factors is key for the future institutional evolution of CERN itself.

In this respect, CERN needs to address the challenge to intergovernmental organizations that is triggered by the incomplete EU institutional framework. Moreover, current EU financial rules are largely unspecialized, and therefore less appropriate for the specificity of research, even if much has been done to minimize these difficulties. However, intergovernmental research organizations have accumulated a vast experience in responsibly managing public funding and stimulating industrial innovation under rules appropriate to frontier science and technology. As EU budgets for research are likely to be strengthened in the future, the EU’s role in new international research ventures might be expected to increase. Contributing to the right institutional environment is, therefore, an issue that we must address.

CERN is also recognized for attracting to Europe talent, ideas and resources from the world at large. However, keeping this unique role requires aiming relentlessly at locating the heart of the world’s best research infrastructure in Europe. The issue of the location of the next generation of accelerators will, therefore, be decisive – both for CERN and for Europe.

Becoming a world leader comes with a price. Could CERN act as an incubator for a new world organization for non-accelerator particle physics, namely astroparticle physics research? The infrastructure in this field must be widely distributed, but the evaluation of priorities – as a result of joint scientific, political and technical expertise – could sit together at CERN, or at a new organization outside Europe that CERN might help to frame.

CERN is also justly recognized as a driver for networking with other fields of science and technology, as well as for new applications. We might expect Europe’s pressing social and economic needs to increasingly require research organizations to improve their direct contribution to the creation of new start-up companies, products and services, and new jobs in Europe. This should be seen as a challenge to be met.

A major opportunity seems now to be at hand. CERN’s initiative for a new R&D international open facility for the biomedical, biophysical and bioengineering communities (e.g. medical imaging and new accelerators for hadron therapy) should, in my view, be considered as an important priority for the laboratory and for its member states.

On the other hand, the societal responsibility of research is bound to increase sharply if science development in Europe progresses faster. Under economic constraints, strengthening and widening the social constituencies for science becomes more important. CERN contributes generously to formal and informal science education. I would, however, suggest that fundamental physics has not yet contributed to its full potential to the needs of general experimental science education in schools, or to the demands of modern science centres.

Finally, CERN is justly proud of having continuously pursued one of the original purposes for its creation in the aftermath of the Second World War: science for peace. Its commitment to peace has been greatly influential in the discreet and stable support for scientists and students from regions of conflict and in painful periods, as well as in more visible support to the Synchrotron-light for Experimental Science and Applications in the Middle East (SESAME) project in Jordan.

However, much more is now needed from us all. The social responsibility of scientists for peace is now desperately needed against war, oppression and misery, increased fanaticism, and against national stereotyping as an insidious prelude to blind acceptance of the inevitability of war. Let us hope that science continues to make bridges for peace.

One Hundred Physics Visualizations Using MATLAB

By Dan Green
World Scientific
Hardback (with DVD): £48
Paperback (with DVD): £23
E-book: £17


The aim of this book is to provide interactive MATLAB scripts in which the user can vary the parameters of a specific problem and immediately see the outcome by way of dynamic “movies” of the response of the system in question. MATLAB tools are used throughout, and the software scripts accompany the text in symbolic mathematics, classical mechanics, electromagnetism, waves and optics, gases and fluid flow, quantum mechanics, special and general relativity, and astrophysics and cosmology. The emphasis is on building up intuition by running many different parameter choices, selected actively by the user, and watching the subsequent behaviour of the system.
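
To give a flavour of this “vary a parameter, watch the response” approach, here is a minimal sketch – written in Python with matplotlib rather than the book’s MATLAB, and using a damped oscillator as our own illustrative choice, not an example taken from the book:

import numpy as np
import matplotlib.pyplot as plt
from matplotlib.widgets import Slider

t = np.linspace(0, 20, 1000)

def response(damping):
    # Displacement of a damped oscillator with unit angular frequency.
    return np.exp(-damping * t) * np.cos(t)

fig, ax = plt.subplots()
plt.subplots_adjust(bottom=0.25)
line, = ax.plot(t, response(0.1))
ax.set_xlabel("time")
ax.set_ylabel("displacement")

# A slider lets the user vary the damping coefficient interactively.
slider_ax = plt.axes([0.15, 0.1, 0.7, 0.03])
slider = Slider(slider_ax, "damping", 0.0, 1.0, valinit=0.1)

def update(val):
    # Redraw the curve each time the slider moves.
    line.set_ydata(response(slider.val))
    fig.canvas.draw_idle()

slider.on_changed(update)
plt.show()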

Modern Functional Quantum Field Theory: Summing Feynman Graphs

By Herbert M Fried
World Scientific
Hardback: £65
E-book: £49


These pages offer a simple, analytic, functional approach to non-perturbative QFT, using a frequently overlooked functional representation of Fradkin to calculate explicitly relevant portions of the Schwinger generating functional. In QED, this corresponds to summing all Feynman graphs representing virtual photon exchange between charged particles. It is then possible to see, analytically, the cancellation of an infinite number of perturbative, UV logarithmic divergences, leading to an approximate but reasonable statement of finite-charge renormalization. A similar treatment of QCD is then able to produce a simple, analytic derivation of quark-binding potentials. An extension into the QCD binding of two nucleons to make an effective deuteron presents a simple, analytic derivation of nuclear forces. Finally, a new QED-based solution of vacuum energy is presented as a possible candidate for dark energy.

A Brief History of String Theory: From Dual Models to M-Theory

By Dean Rickles
Springer
Hardback: £35.99 €32.12 $49.99
E-book: £27.99 €42.79 $39.99
Also available at the CERN bookshop


String theory provides a theoretical framework for unifying particle physics and gravity that is also consistent at the quantum level. Apart from particle physics, it also sheds light on a vast range of problems in physics and mathematics. For example, it helps in understanding certain properties of gauge theories, black holes, the early universe and even heavy-ion physics.

This new book fills a gap by reviewing the 40-year-plus history of the subject, which it divides into four parts, with the main focus on the earlier decades. The reader learns in detail about the work of researchers in the early days, when so-called dual models were investigated with the aim of describing hadron physics. It took ingenious insights to realize that the underlying physical interpretation is in terms of small, oscillating strings. Some of the groundbreaking work took place at CERN – for example, the discovery of the Veneziano amplitude.

The reader obtains a good impression of how it took many years of collective effort and struggle to develop the theory and understand it better, often incrementally, although sometimes the direction of research changed drastically in a serendipitous manner. For example, at some point there was an unexpected shift of interpretation, namely in terms of gravity rather than hadron physics. Supersymmetry was discovered along the way as well, demonstrating that string theory has been the source and inspiration of many ideas in particle physics, gravity and related fields.

The main strength of the book is the extensively and carefully researched history of string theory, rather than profound explanations of the physics (for which enough books are available). It is full of anecdotes, quotations of physicists at the time, and historical facts, to an extent that makes it unique. Despite the author’s avoidance of technicalities, the book seems to be more suitable for people educated in particle physics, and less suitable for philosophers, historians and other non-experts.

One caveat, however: the history covered in the book more or less stops at around the mid-1990s, and as the author emphasizes, the subject becomes much harder to describe after that, without going into the details more deeply. While some of the new and important developments are mentioned briefly in the last chapter – for example, the gauge/gravity correspondence – they do not get the attention that they deserve in relation to older parts of the history. In other words, while the history has been quite accurately presented until the mid-1990s, the significance of some of its earlier parts is rather overrated in comparison with more recent developments.

In summary, this is a worthwhile and enjoyable book, full of interesting details about the development of one of the main research areas of theoretical physics. It appears to be most useful to scientists educated in related fields, and I would even say that it should be a mandatory read for young colleagues entering research in string theory.

Semiconductor X-Ray Detectors

By B G Lowe and R A Sareen
CRC Press
Hardback: £108


The history and development of Si(Li) X-ray detectors is an important supplement to the knowledge required for a full understanding of the workings of silicon drift detectors (SDDs), CCDs and compound semiconductor detectors. This book provides an up-to-date review of the principles, practical applications and state of the art of semiconductor X-ray detectors, and describes many of the facets of X-ray detection and measurement using semiconductors – from manufacture to implementation. The initial chapters present a self-contained summary of the relevant background physics, materials science and engineering aspects. Later chapters compare and contrast the assembly and physical properties of the systems and materials currently employed.

Fission and Properties of Neutron-Rich Nuclei: Proceedings of the Fifth International Conference

By J H Hamilton and A V Ramayya (eds.)
World Scientific
Hardback: £131
E-book: £98


The five-year interval between the international conferences covering fission and properties of neutron-rich nuclei allows for significant new results to be achieved. At the latest conference in the series, leaders in theory and experiment presented their latest results in areas such as the synthesis of superheavy elements, recent results and new facilities using radioactive ion beams, the structure of neutron-rich nuclei, the nuclear fission process, fission yields and nuclear astrophysics. The conference brought together more than 100 speakers from the major nuclear laboratories, along with leading researchers from around the world.

Statistical Data Analysis for the Physical Sciences

By Adrian Bevan
Cambridge University Press
Hardback: £40 $75
Paperback: £18.99 $31.99
E-book: $26
Also available at the CERN bookshop


The numerous foundational errors and misunderstandings in this book make it inappropriate for use by students or research physicists at any level. There is space here to indicate only a few of the more serious problems.

The fundamental concepts – probability, probability density function (PDF) and likelihood function – are confused throughout. Likelihood is defined as being “proportional to probability”, and both are confused with a PDF in section 3.8(6). Exercise 3.11 invites the reader to “re-express the PDF as a likelihood function”, which is absurd because the two are functions of different kinds of arguments.
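
For readers who have not met the distinction, a standard statement – ours, not the book’s – is that the same function of two arguments serves both roles, but with different arguments varied:

\[
\text{PDF: } x \mapsto f(x;\theta)\ (\theta\ \text{fixed}), \qquad
\text{likelihood: } \theta \mapsto L(\theta) = f(x_{\mathrm{obs}};\theta)\ (x_{\mathrm{obs}}\ \text{fixed}),
\]
\[
\int f(x;\theta)\,\mathrm{d}x = 1, \qquad \text{but in general} \quad \int L(\theta)\,\mathrm{d}\theta \neq 1.
\]

This is why “re-expressing” one as the other makes no sense: they are functions of different arguments.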

Probability and probability density are confused most notably in section 5.5 (χ2 distribution), where the “probability of χ2” is given as the value of the PDF instead of its integral from χ2 to infinity. (The latter quantity is in fact the p value, which is introduced later in section 8.2, but is needed here already.) The student who evaluates the PDFs labelled P(χ2, ν) in figure 5.6 to do exercises 5.10 to 5.12 will get the wrong answers, but the numbers given in table E11 – miraculously – are correct p values. Fortunately the formulas in the book were not used for the tables.
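
To make the point concrete, here is a minimal check in Python (our addition, not material from the book) showing that the p value is the upper-tail integral of the χ² PDF, not the PDF value itself:

from scipy.stats import chi2

chi2_obs, ndof = 15.0, 10

pdf_value = chi2.pdf(chi2_obs, ndof)  # value of the PDF: NOT a probability
p_value = chi2.sf(chi2_obs, ndof)     # integral from chi2_obs to infinity

print(f"PDF value at chi2={chi2_obs}: {pdf_value:.4f}")
print(f"p value (upper tail):        {p_value:.4f}")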

From the beginning there is confusion about what is Bayesian and what is not. Bayesian probability is defined correctly as a degree of belief, but Bayes’s theorem is introduced in the section entitled “Bayesian probability”, even though it can be used equally well in frequentist statistics, and in fact nearly all of the examples use frequentist probabilities. The different factors in Bayes’s theorem are given Bayesian names (one of which is wrong: the likelihood function is inexplicably called “a priori probability”), but the examples labelled “Bayesian” do not use the theorem in a Bayesian way. Worse, the example 3.7.4, labelled Bayesian, confuses the two arguments of conditional probability throughout, and equation 3.17 is wrong (as can be seen by comparing it with P(A) in section 3.2, which is correct). On the other hand, in section 8.7.1 a similar example – with frequentist probabilities again – is presented clearly and correctly. Example 3.7.5 (also labelled Bayesian) is, as far as I can see, nonsense (what is outcome A?).
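
For reference, the theorem that such examples must respect reads (a standard statement, not quoted from the book):

\[
P(A \mid B) = \frac{P(B \mid A)\,P(A)}{P(B)}, \qquad
P(B) = \sum_i P(B \mid A_i)\,P(A_i),
\]

and interchanging \(P(A \mid B)\) with \(P(B \mid A)\) is precisely the confusion of arguments described above.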

The most serious errors occur in chapter 7 (confidence intervals). Confidence intervals are frequentist by definition, otherwise they should be called credible intervals. But the treatment here is a curious mixture of Bayesian, frequentist and pure invention. The definition of the confidence level (CL) is novel and involves integration under a PDF that could be the Bayesian posterior but in some examples turns out to be a likelihood function. Coverage is then defined in a frequentist-inspired way (invoking repeated experiments), but it is not the correct frequentist definition. The Feldman–Cousins (F–C) frequentist method is presented without having described the more general Neyman construction on which it is based. A good treatment of the Neyman construction would have allowed the reader to understand coverage better, which the book identifies correctly as the most important property of confidence intervals. It is true that for discrete (e.g. Poisson) data, the F–C method in general over-covers, but it should also have been stated that for this case any method (including Bayesian) that covers for all parameter values must over-cover for some. The “coverage” that this book claims to be exact for Bayesian methods is not an accepted definition because it represents subjective belief only and does not have the frequentist properties required by physicists.
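
The frequentist definition of coverage invoked here can be illustrated with a short simulation – our sketch, not an example from the book: generate many pseudo-experiments at a fixed true value and count how often the reported interval contains it. For a single Gaussian measurement, the interval x ± σ should cover 68.27% of the time.

import numpy as np

rng = np.random.default_rng(seed=1)
mu_true, sigma, n_expts = 5.0, 1.0, 100_000

# Each "experiment" is one Gaussian measurement; the 68.27% CL interval
# quoted by the experimenter is x +/- sigma.
x = rng.normal(mu_true, sigma, n_expts)
covered = np.abs(x - mu_true) < sigma

print(f"empirical coverage: {covered.mean():.4f}  (expected 0.6827)")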

The Physics of Reality: Space, Time, Matter, Cosmos

By Richard L Amoroso, Louis H Kauffman, Peter Rowlands (ed.)
World Scientific
Hardback: £111
E-book: £83


As the proceedings of the 8th Symposium Honoring Mathematical Physicist Jean-Pierre Vigier, this book introduces a new method in theory formation, completing the tools of epistemology. Like Vigier himself, the Vigier symposia are noted for addressing avant-garde, cutting-edge topics in contemporary physics. In this volume, several important breakthroughs are introduced for the first time. The most interesting is a continuation of Vigier’s pioneering work on tight-bound states in hydrogen. The new experimental protocol described not only promises empirical proof of large-scale extra dimensions in conjunction with avenues for testing string theory, but also implies the birth of unified field mechanics, ushering in a new age of discovery.
