
Deep physics brought to life through photography

The 2018 Global Physics Photowalk brought hundreds of amateur and professional photographers to 18 laboratories around the world, including CERN, to capture their scientific facilities and workforce. The science of the participating labs ranges from exploring the origins of the cosmos to understanding our planet’s climate, and from improving human and animal health to helping deliver secure and sustainable food and energy supplies for the future.

Following local competitions, each lab submitted its top three images to the global competition. A public online vote chose the top three from those images, and a jury of expert photographers and scientists also picked their three favourites. The photowalk was organised by the Interactions collaboration, and was supported by the Royal Photographic Society and Association of Science-Technology Centers (ASTC). The winning entries, shown here, were announced on 30 September at the ASTC annual conference in Hartford, Connecticut.

Simon Wright bagged first place in the expert jury’s choice with this shot taken at the UK’s STFC Boulby Underground Laboratory, which is located 1.1 km underground in Europe’s deepest operating mine and contributes to the search for dark matter. The photograph captures STFC’s Tamara Leitan as she scanned an information board at the lab. To highlight Leitan’s face, Wright used a miner’s lamp instead of a flash to minimise interference with light reflected from the safety equipment that workers must wear at the mine.

Simon Wright received another award, this time third prize in the people’s choice category, for this image of green fluorescent lighting in an underground tunnel at the UK’s STFC Chilbolton Observatory, which is home to a wide range of science facilities.

Jon McRae took third place in the expert jury’s selection, as well as second place in the people’s choice, for this photo of the DESCANT neutron detector at Canada’s TRIUMF laboratory. The detector can be mounted on the TIGRESS and GRIFFIN experiments to study nuclear structure. Holding a small, spherical lens between the camera and the detector array, McRae recreated a miniature simulacrum of DESCANT in the crystal-clear glass ball.

Stefano Ruzzini won the expert jury’s second prize for this photograph of a silicon-strip particle detector, which was first used in CERN’s NA50 experiment but is now at Italy’s INFN Frascati National Laboratories. The photo was praised by the judges for portraying the three-dimensional aspect of the detector.

This picture from Gianluca Micheletti was also awarded third place in the expert jury’s selection. It shows a researcher observing the XENON1T dark-matter experiment at Italy’s INFN Gran Sasso National Laboratories. The judges commended Micheletti’s composition of the image in evoking the sense of curiosity at the heart of physics.

Luca Riccioni snapped a picture of the KLOE-2 experiment at Italy’s INFN Frascati National Laboratories, which recently concluded its data-taking campaign at the DAΦNE electron–positron collider. The photograph was awarded first place in the people’s choice category.

Hands-on education at the frontiers of science

Winning students from 2018

Is it fun to learn physics from a textbook? According to many teenage participants in CERN’s Beamline for Schools (BL4S) programme, physics lessons at school are much too theoretical. Students in some countries have no physics lessons at all, let alone any contact with current science.

Back in 2011, experimental particle physicist Christoph Rembser of CERN had an idea to get high-school students engaged with particle physics by offering them the chance to carry out their own experiment on a CERN beamline. Three years later, CERN’s 60th anniversary in 2014 offered an opportunity for what was meant to be a one-off worldwide science competition: BL4S was born. Helped by the media attention surrounding CERN at the time of the anniversary, teams of high-school students and their teachers were invited to propose an experiment at CERN. The response was overwhelming: almost 300 teams involving more than 3000 students from 50 countries submitted a proposal.

When the first two teams came to CERN in September 2014, it was clear that BL4S would not be a one-off event. Clearly the competition had the potential to attract large numbers of high-school students every year to get deeply involved with physics at the crucial stage in their education, two years before leaving school to take up further study. Ashish Tutakne, a member of the 2018 winning team from the Philippines, sums this up: “I believe the experience holds significant weight as it is not only a chance to collaborate with some of the smartest people in the world on a scientific project, it is also a taste of what conducting research is actually like. It is this experience that I believe that will in fact prove valuable to me … throughout the rest [of] my life.”

CERN and society

Thanks to the huge success of the first edition, institutes and foundations around the world also recognised the potential of the competition. Through the CERN & Society Foundation, an independent charitable organisation supported by private donors, BL4S has since been provided with the financial help without which it would not have been possible to turn the competition into an annual event. The CERN & Society Foundation has the aim of spreading CERN’s spirit of scientific curiosity for the benefit of society, and supports young talent through high-quality, hands-on training. This year, for example, in addition to the BL4S initiative, the foundation has helped more than 80 educators participate in CERN’s national teacher programme and granted more than 60 Summer Student scholarships.

Winning teams so far

So far, more than 900 teams with almost 8500 students from 76 countries have taken part in the BL4S competition, with one third of these students being female. While in the first edition in 2014 about 70% of the teams came from member states of CERN, this year roughly two-thirds of the participating teams were from associate and non-member states. This emphasises the international character of the competition and its global appeal.

The announcement of each edition of BL4S is usually made during the summer of the preceding year, with a deadline of 31 March for submitting a proposal of up to 1000 words and a one-minute video. After about two months of evaluation, involving more than 50 volunteer physicists, the two winning teams and up to 30 shortlisted teams are announced in June. Besides the certificates that every participant receives, the shortlisted teams win special prizes such as BL4S T-shirts for every team member. Finally, the two winning teams are invited to CERN in September or October for about 12 days to carry out their experiments.

Of course, the winning teams do not work alone: they are guided by two professional scientists. These support scientists, typically young PhD students in physics, make the largest contribution to the success of BL4S. They are not only responsible for fine-tuning and implementing the winning teams’ experiments but have also, in collaboration with the CERN detector workshops, developed bespoke devices for use in the BL4S experiments. Although each support scientist is involved with the project for less than a year, it offers them the opportunity to carry out a complete physics experiment from beginning to end; the skills they acquired have helped several of them to find interesting postdoc positions.

Beamline specifics

From the beginning, BL4S has attracted many CERN staff members, as well as users and even retired staff, to make voluntary contributions to the organisation of the event. This involves answering questions from the student teams, evaluating proposals, developing detectors and software, helping the winners with the analysis of their data, and much more. These volunteers have become a crucial part of the competition.

The BL4S beamline at CERN

CERN’s accelerator complex is vast, and is in constant use by thousands of physicists worldwide. Since the first edition, the BL4S experiments have taken place at the T9 beamline of the Proton Synchrotron fixed-target area in the “East Hall” on the main CERN site. This beamline offers a secondary beam with a momentum of 0.5–10 GeV/c and a mixture of electrons, pions, kaons, protons and some muons. Regarding detectors, CERN provides a range of technologies: scintillators, Cherenkov counters, delay wire chambers, multigap resistive plate chambers, micro-mesh gaseous structure detectors, lead-glass calorimeters and Timepix detectors. In addition, students are allowed to build their own detectors and bring them to CERN. For the triggering, NIM modules are used, while the data-acquisition system is based on the RCD-TDAQ system of the ATLAS experiment. The student teams are provided with a detailed document that describes all of these components.

The students are completely free with respect to the experiment and the use of these materials, as long as it does not raise any safety concerns. Quite often we are surprised by their creativity, and the ten winning proposals from the past five years illustrate the wide spectrum of their ideas (see table above). Beyond these winning proposals, the full set of proposals received shows what captures the attention of curious teenagers. Just a few examples are: the shielding of spacecraft to protect astronauts from the dangers of cosmic radiation; the analysis of the atmosphere with respect to greenhouse gases; the exploration of natural resources; the creation of artificial aurora borealis; and the artistic translation of signals of elementary particles into sights or sounds.

For successful participation in BL4S, the role of teachers and other mentors is paramount. Many teachers do not feel confident enough to encourage their students to take part: some feel insufficiently qualified for such a challenge, while others do not get the support from their schools that is necessary to coach a team for weeks, if not months. After all, many teachers are severely limited in the time they can devote to such activities. In some cases, students go ahead without any mentor and complete their proposal in a self-directed way. In others, they contact physicists at local universities or at one of the national or regional contact points established in almost 30 countries. Usually, however, the main burden falls on the teacher, and we are very grateful to the many teachers who every year dedicate a substantial part of their free time to coaching a team of students. Unfortunately, our surveys show that, owing to the high workload, only a few teachers are able to participate several years in a row.

The effect that BL4S has on the many students that are not lucky enough to be invited to CERN is difficult to assess. We know, however, via feedback from several teachers, that BL4S is appreciated as a means of motivating their students. In addition, the students themselves often write that their participation was a great experience for them and many are even motivated to work on their proposals and improve them to take part again in the next edition.

The winning teams are encouraged to stay together after having been at CERN and to write a paper about their experiment. So far, three papers have been published in an international peer-reviewed journal, Physics Education, with the following titles: Building and testing a high school calorimeter at CERN; The secret chambers in the Chephren pyramid; and Testing the validity of the Lorentz factor (see further reading). Papers are typically published one to two years after the completion of the experiment. At least one further paper is currently in the pipeline. This is not a mandatory step for the teams, but it represents a unique opportunity to have authored a scientific publication before even starting at university.

According to a recent survey among the previous winners, most take up studies of natural sciences, engineering or mathematics. Max Raven of the 2016 winning team “Relatively Special” from Colchester Royal Grammar School in the UK remarked: “The most beneficial impact of BL4S has been the strong team-working and communication skills I developed… This invaluable experience has been instrumental to developing my interpersonal skills, which are vital for a successful career in engineering.” After taking part in the BL4S competition Raven was accepted to study engineering by the Massachusetts Institute of Technology.

Students and teachers alike are clearly very happy to be associated with the competition, and this also benefits CERN and its educational aims. Winning BL4S often generates a lot of media attention in the teams’ home regions or even at the national level, and recently the two Italian teams that won BL4S in 2015 and 2017 were invited to the ministry of foreign affairs in Rome for a special ceremony. At the same time, BL4S contributes to physics education by leading students into a field of physics rarely touched upon in school curricula. Being able to do hands-on physics with detectors and accelerators that are also used for other current experiments is a huge motivation for students to learn, even in their free time. Yash Karan, a member of the Philippine winning team in 2018, remarked: “I have learnt much more in the last two weeks at CERN than in the last six months in school!”

Next stop DESY

At the end of this year, CERN’s accelerator complex will be shut down for a period of two years to make way for maintenance and upgrades, in particular for the High-Luminosity LHC. This opens a new chapter in the history of the BL4S competition. In close collaboration with the DESY laboratory in Germany, the competition will continue there in 2019. DESY will provide beam time at the DESY II facility, offering electron and positron beams, and employ a dedicated support scientist on a three-year staff contract. Other institutes such as INFN-Frascati in Italy and the Paul Scherrer Institut in Switzerland are also interested in hosting the competition in the future.

What remains is the never-ending challenge of spreading the word. Even though CERN has many traditional and modern channels of communication, making BL4S known to high-school students and teachers around the world takes the effort of a large number of people at all levels. In particular, volunteers are needed to spread the word in their region and through their available channels, where they play several roles: acting as additional regional contacts for candidate teams; providing coaching if no teacher is available; taking part in the evaluation of proposals; assisting the winning teams with their data analysis and writing of scientific papers; and, finally, finding additional sponsors. Anyone interested can contact the BL4S team via bl4s.team@cern.ch.

As this article went to press, the 2018 winners were completing their experiments, which were hugely successful. All of the students said that they had learned an immense amount, and they admired the passion that surrounded them everywhere they went at CERN. Working together in mixed shift crews each day, the teams also learned about one another’s experiments, fostering cooperation and personal growth. Quotes such as “Beamline for Schools was a life-changing experience” are not uncommon, and many of this year’s students have made up their minds to pursue a career in particle physics or engineering.

The registration and proposal-submission for BL4S 2019 are now open. Hopefully the next edition will attract even more students from all around the globe to participate in this unique opportunity.

LHCb discovers two new baryons

Resonant structure

A report from the LHCb experiment

Although the quark model of hadrons is highly successful in describing how the quarks combine to form baryons and mesons, the internal mechanisms governing the dynamics of the strong force that binds quarks inside those hadrons are far from fully understood. By studying new hadronic resonances and their excited states, light can be shed on these mechanisms.

LHCb physicists have recently observed, for the first time, two new baryons. These states, named Σb(6097)+ and Σb(6097)−, appear as resonances in the two-body system Λb0π±, which consists of a neutral Λb0 baryon and a charged π meson (see figure). The statistical significances of the observations are 12.7σ and 12.6σ, respectively, well above the threshold for discovery.

The new particles are members of the Σb family of baryons. Four of the six so-called ground states of this family, the Σb+, Σb−, Σb*+ and Σb*−, were previously discovered by the CDF collaboration at the Tevatron. LHCb also reports a study of the properties of these four ground states, measuring them with unprecedented statistics and improving the precision on their masses and widths by a factor of approximately five.

Establishing precisely how the new Σb(6097)+ and Σb(6097)− states fit into this family is not straightforward. Theoretical predictions for a number of excited Σb states exist, including five Σb(1P) states with expected masses close to the values seen by LHCb – though some of them may be difficult to observe experimentally. Since it is possible for different excited states to have similar masses, it cannot be excluded that the newly observed mass peaks are actually superpositions of more than one state. Further input from theory, and future experimental studies with more data and in other final states, will help resolve this question.

The meson sector is also capable of providing surprising results. Evidence for another new hadron has recently been reported by LHCb in a Dalitz-plot analysis of B0 decays to ηc(1S)K+π−. A structure, which could be a new resonance in the ηc(1S)π− system, was detected with a significance of more than three standard deviations. While this does not meet the threshold for discovery, it is an intriguing hint that will be pursued with more data. If confirmed, this new Zc(4100) resonance would be one of a small number of manifestly exotic mesons that cannot be described as a quark–antiquark pair but must instead have a more complicated structure, such as a tetraquark combination of two quarks and two antiquarks.

Gravitational hunt for extra dimensions


General relativity predicts very accurately how objects fall from a table and how planets move within the solar system. At larger scales, however, some issues arise. The most glaring are the theory’s predictions of the motion of stars within a galaxy and of the acceleration of galaxies away from each other, both of which are at odds with observations. Models containing dark matter and dark energy can solve these two problems, respectively. Another potential solution is that space–time contains additional dimensions, modifying general relativity. Such additional dimensions are not observable with electromagnetic waves, but new information gleaned from gravitational waves (GWs) is allowing such models to be tested for the first time.

Some modifications of general relativity, such as the Dvali–Gabadadze–Porrati (DGP) model, involve the addition of extra dimensions accessible to gravity. If such extra dimensions are large, and thus not rolled up to a microscopic size as predicted by some beyond-Standard-Model theories, part of the gravitational field would “leak” into the extra dimensions. GWs arriving at detectors such as those of the LIGO and Virgo observatories would therefore be weaker than expected.

The first GWs detected, beginning in September 2015, came from distant black-hole binaries. For such objects there is no electromagnetic counterpart, so the only information astronomers have about their distance from Earth comes from the GWs themselves, making it impossible to check whether some of the waves’ intensity was lost. However, GW170817, the first observed merger of binary neutron stars, produced both GWs and electromagnetic radiation, which was measured by a wide range of instruments (CERN Courier December 2017 p16). As a result, we know in which galaxy the merger took place and therefore have a good measurement of the distance the GWs travelled. Combining this distance with the measured strength of the GW signal, one can test whether the signal follows general relativity or a model with additional dimensions.

Doing exactly this, a group led by Kris Pardo of Princeton University has found that the results are most compatible with the standard picture of 3+1 space–time dimensions. Because of the large discrepancy between Hubble-constant values obtained by two different methods (CERN Courier May 2018 p17), the researchers repeated the analysis for both values and show that, regardless of the value assumed, the results allow for a total of 4.0 ± 0.1 dimensions (see figure).
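The arithmetic behind this kind of inference can be sketched in a toy calculation (the function and normalisation below are illustrative, not taken from the paper): if the strain falls off as h ∝ d^−(D−2)/2, then the luminosity distance inferred from the strain under the standard 1/d assumption, d_GW, relates to the electromagnetic distance d_EM (both expressed in units of some reference scale) as d_GW = d_EM^(D−2)/2, which can be inverted for D:

```python
import math

def inferred_dimensions(d_em: float, d_gw: float) -> float:
    """Toy estimate of the number of spacetime dimensions D.

    Assumes the GW strain scales as h ~ d**(-(D - 2) / 2), so the distance
    inferred from the strain with the standard 1/d falloff (d_gw) relates
    to the true electromagnetic distance (d_em, same dimensionless units)
    via d_gw = d_em**((D - 2) / 2).  Illustrative only, not the paper's fit.
    """
    return 2.0 + 2.0 * math.log(d_gw) / math.log(d_em)

# If the GW and EM distances agree, we recover the standard D = 4:
print(inferred_dimensions(100.0, 100.0))   # 4.0
# A weaker-than-expected signal (larger inferred GW distance) means D > 4:
print(inferred_dimensions(100.0, 1000.0))  # ~5.0
```

In this toy picture, leakage into extra dimensions always shows up as d_GW > d_EM, i.e. an inferred D above 4.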

The authors also obtained a lower limit on the graviton’s lifetime of 450 million years. As with a potential leakage of gravity into extra dimensions, the decay of gravitons propagating towards Earth would cause the strength of the GW signal to decrease.

These findings are just the beginning of the physics studies made possible by gravitational-wave astronomy. As the authors make clear in their paper, the results only affect theories with finite but large-scale extra dimensions. That may change, however, as more GWs are expected to be measured, with increased precision, in the future. One promising parameter capable of probing a larger set of models is the polarisation of the GWs. For the GW170817 system, polarisation information was not available at the time of observation owing to the limited number of GW detectors. Any higher-dimensional model allows for extra GW polarisation modes, which can be studied with the help of additional GW detectors such as the planned KAGRA and IndIGO facilities.

With a future global array of GW detectors, we can look forward to more studies in this field of physics which, until now, has been almost inaccessible.

The roots and fruits of string theory

What led you to the 1968 paper for which you are most famous?

In the mid-1960s we theorists were stuck in trying to understand the strong interaction. We had one example of a relativistic quantum theory that worked – QED, the theory of interacting electrons and photons – but it looked hopeless to copy that framework for the strong interactions. One reason was the strength of the strong coupling compared to the electromagnetic one. Even more disturbing was that there were so many different species of hadrons, with the number ever growing, that we felt at a loss with field theory: how could we cope with so many different states in a QED-like framework? We now know how to do it, and the solution is called quantum chromodynamics (QCD).

Gabriele Veneziano

But things weren’t so clear back then. The highly non-trivial jump from QED to QCD meant having the guts to write a theory for entities (quarks) that nobody had ever seen experimentally. No one was ready for such a logical jump, so we tried something else: an S-matrix approach. The S-matrix, which relates the initial and final states of a quantum-mechanical process, allows one to calculate the probabilities of scattering processes directly, without solving a quantum field theory such as QED. This is why it looked more promising. It also looked very conventional but eventually led to something even more revolutionary than QCD – the idea that hadrons are actually strings.

Is it true that your “eureka” moment was when you came across the Euler beta function in a textbook?

Not at all! I was taking a bottom-up approach to understanding the strong interaction. The basic idea was to impose on the S-matrix a property now known as Dolen–Horn–Schmid (DHS) duality. It relates two apparently distinct processes contributing to an elementary reaction, say a + b → c + d. In one process, a and b fuse to form a metastable state (a resonance) which, after a characteristic lifetime, decays into c and d. In the other process the pair a+c exchanges a virtual particle with the pair b+d. In QED these two processes have to be added because they correspond to two distinct Feynman diagrams while, according to DHS duality, for strong interactions each one provides the whole story. I had heard about DHS duality from Murray Gell-Mann at the Erice summer school in 1967, where he said that DHS would lead to a “cheap bootstrap” for the strong interaction. Hearing this from a great physicist motivated me enormously. I was in the middle of my PhD studies at the Weizmann Institute in Israel. Back there in the fall, a collaboration of four people was formed: Marco Ademollo, on leave at Harvard from Florence, and Hector Rubinstein, Miguel Virasoro and myself at the Weizmann Institute. We worked intensively for eight or nine months trying to solve the (apparently not so) cheap bootstrap for a particularly convenient reaction. We got very encouraging results, hinting, I felt, at the existence of a simple exact solution. That solution turned out to be the Euler beta function.
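For readers unfamiliar with the formula, the solution can be sketched in standard modern notation (an editorial addition, not part of the interview; the normalisation and the linear Regge trajectories α(x) = α(0) + α′x are conventional):

```latex
% Veneziano amplitude for a + b -> c + d, with linear Regge trajectories
% \alpha(x) = \alpha(0) + \alpha' x :
A(s,t) \;=\; B\bigl(-\alpha(s),\,-\alpha(t)\bigr)
       \;=\; \frac{\Gamma(-\alpha(s))\,\Gamma(-\alpha(t))}
                  {\Gamma(-\alpha(s)-\alpha(t))}.
% The Gamma functions produce poles (resonances) in both the s- and
% t-channels, and the symmetry of the beta function under s <-> t makes
% DHS duality manifest: either channel alone reproduces the amplitude.
```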

Veneziano in July 1968

But the 1968 paper was authored by you alone?

Indeed. The preparatory work done by the four of us had a crucial role, but the discovery that the Euler beta function was an exact realisation of DHS duality was just my own. It was around mid-June 1968, just days before I had to take a boat from Haifa to Venice and then continue to CERN where I would spend the month of July. By that time the group of four was already dispersing (Rubinstein on his way to NYU, Virasoro to Madison, Wisconsin via Argentina, Ademollo back to Florence before a second year at Harvard). I kept working on it by myself, first on the boat, then at CERN until the end of July when, encouraged by Sergio Fubini, I decided to send the preprint to the journal Il Nuovo Cimento.

Was the significance of the result already clear?

Well, the formula had many desirable features, but the reaction of the physics community came to me as a shock. As soon as I had submitted the paper I went on vacation for about four weeks in Italy and did not think much about it. At the end of August 1968, I attended the Vienna conference – one of the biennial Rochester-conference series – and found out, to my surprise, that the paper was already widely known and got mentioned in several summary talks. I had sent the preprint as a contribution and was invited to give a parallel-session talk about it. Curiously, I have no recollection of that event, but my wife remembers me telling her about it. There was even a witness, the late David Olive, who wrote that listening to my talk changed his life. It was an instant hit, because the model answered several questions at once, but it was not at all apparent then that it had anything to do with strings, not to mention quantum gravity.

When was the link to “string theory” made?

The first hints that a physical model for hadrons could underlie my mathematical proposal came after the latter had been properly generalised (to processes involving an arbitrary number of colliding particles) and the whole spectrum of hadrons it implied had been unravelled (by Fubini and myself and, independently, by Korkut Bardakci and Stanley Mandelstam). Surprisingly, that spectrum turned out to closely resemble the exponentially growing (with mass) spectrum postulated almost a decade earlier by CERN theorist Rolf Hagedorn and, at least naively, it implied an absolute upper limit on temperature (the so-called Hagedorn temperature).

The spectrum coincides with that of an infinite set of harmonic oscillators and thus resembles the spectrum of a quantised vibrating string with its infinite number of higher harmonics. Holger Nielsen and Lenny Susskind independently suggested a string (or a rubber-band) picture. But, as usual, the devil was in the details. Around the end of the decade Yoichiro Nambu (and independently Goto) gave the first correct definition of a classical relativistic string, but it took until 1973 for Goddard, Goldstone, Rebbi and Thorn to prove that the correct application of quantum mechanics to the Nambu–Goto string reproduced exactly the above-mentioned generalisation of my original work. This also included certain consistency conditions that had already been found, most notably the existence of a massless spin-1 state (by Virasoro) and the need for extra spatial dimensions (from Lovelace’s work). At that point it became clear that the original model had a clear physical interpretation of hadrons being quantised strings. Some details were obviously wrong: one of the most striking features of strong interactions is their short-range nature, while a massless state produces long-range interactions. The model being inconsistent for three spatial dimensions (our world!) was also embarrassing, but people kept hoping.
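Schematically, and in modern notation (an editorial sketch with α′ the Regge slope and natural units ħ = c = kB = 1), the spectrum described above takes the form:

```latex
% Open-string mass levels, equally spaced in M^2 like harmonic oscillators:
M_N^2 \;\simeq\; \frac{N}{\alpha'}, \qquad N = 0, 1, 2, \dots
% Exponentially growing density of states, defining the Hagedorn temperature:
\rho(m) \;\sim\; m^{-a}\, e^{m/T_H}, \qquad T_H \sim \frac{1}{\sqrt{\alpha'}},
% so the thermal partition function \int dm\, \rho(m)\, e^{-m/T}
% diverges for T > T_H: an absolute upper limit on temperature.
```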

So string theory was discovered by accident?

Not really. Qualitatively speaking, having found that hadrons are strings was no small achievement for those days. It was not precisely the string we now associate with quark confinement in QCD; indeed, the latter is so complicated that only the most powerful computers could shed some light on it many decades later. A posteriori, the fact that by looking at hadronic phenomena we were driven into discovering string theory was neither a coincidence nor an accident.

When was it clear that strings offer a consistent quantum-gravity theory?

This very bold idea came as early as 1974 from a paper by Joel Scherk and John Schwarz. Confronted with the fact that the massless spin-1 string state refused to become massive (there is no Brout–Englert–Higgs mechanism at hand in string theory!) and that even a massless spin-2 state had to be part of the string spectrum, they argued that those states should be identified with the photon and the graviton, i.e. with the carriers of the electromagnetic and gravitational interactions, respectively. Other spin-1 particles could be associated with the gluons of QCD or with the W and Z bosons of the weak interaction. String theory would then become a theory of all interactions, at a deeper, more microscopic level. The characteristic scale of the hadronic string (~10⁻¹³ cm) had to be reduced by 20 orders of magnitude (~10⁻³³ cm, the famous Planck length) to describe the quarks themselves, the electron, the muon and the neutrinos – in fact every elementary particle – as a string.

In addition, it turned out that a serious shortcoming of the old string (namely its “softness”, meaning that string–string collisions cannot produce events with large deflection angles) was a big plus for the Scherk–Schwarz proposal. While the data were showing that hard hadron collisions were occurring at substantial rates, in agreement with QCD predictions, the softness of string theory could free quantum gravity from its problematic ultraviolet divergences – the main obstacle to formulating a consistent quantum-gravity theory.

Did you then divert your attention to string theory?

Not immediately. I was still interested in understanding the strong interactions and worked on several aspects of perturbative and non-perturbative QCD and their supersymmetric generalisations. Most people stayed away from string theory during the 1974–1984 decade. Remember that the Standard Model had just come to life and there was so much to do in order to extract its predictions and test it. I returned to string theory after the Green–Schwarz revolution in 1984. They had discovered a way to reconcile string theory with another fact of nature: the parity violation of weak interactions. This breakthrough put string theory in the hotspot again and since then the number of string-theory aficionados has been steadily growing, particularly within the younger part of the theory community. Several revolutions have followed since then, associated with the names of Witten, Polchinski, Maldacena and many others. It would take too long to do justice to all these beautiful developments. Personally, and very early on, I got interested in applying the new string theory to primordial cosmology.

Was your 1991 paper the first to link string theory with cosmology?

I think there was at least one already, a model by Brandenberger and Vafa trying to explain why our universe has only three large spatial dimensions, but it was certainly among the very first. In 1991, I (and independently Arkadi Tseytlin) realised that the string-cosmology equations, unlike Einstein's, admit a symmetry (also called, alas, duality!) that connects a decelerating expansion to an accelerating one. That, I thought, could be a natural way to realise inflationary cosmology, studied since the early 1980s, within string theory and without invoking an ad hoc "inflaton" particle.

The problem was that the decelerating solution had, superficially, a Big Bang singularity in its past, while the (dual) accelerating solution had a singularity in the future. But this was only the case if one neglected effects related to the finite size of the string. Many hints, including the already mentioned upper limit on temperature, suggested that Big Bang-like singularities are not really there in string theory. If so, the two duality-related solutions could be smoothly connected to provide what I dubbed a "pre-Big Bang scenario", characterised by the lack of a beginning of time. I think that the model (further developed with Maurizio Gasperini and by many others) is still alive, at least as long as a primordial B-mode polarisation is not discovered in the cosmic microwave background, since such polarisation is predicted to be insignificant in this cosmology.

Did you study other aspects of the new incarnation of string theory?

A second line of string-related research, which I have followed since 1987, concerns thought experiments designed to probe what string theory can teach us about quantum gravity, in the spirit of what people did in the early days of quantum mechanics. In particular, with Daniele Amati and Marcello Ciafaloni first, and then with many others, I have studied string collisions at trans-Planckian energies (above 10¹⁹ GeV) that cannot be reached in human-made accelerators but could have existed in the early universe. I am still working on it. One outcome of that study, which became quite popular, is a generalisation of Heisenberg's uncertainty principle implying a minimal value of Δx of the order of the string size.
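Schematically, this stringy generalisation of the uncertainty principle can be written as follows (the precise numerical coefficient is model-dependent; ℓ_s denotes the string length):

```latex
% Generalised uncertainty principle with a stringy correction term:
% the second term grows with \Delta p, so \Delta x cannot shrink indefinitely
\Delta x \;\gtrsim\; \frac{\hbar}{\Delta p} \;+\; \ell_s^2\,\frac{\Delta p}{\hbar},
\qquad \text{minimised at } \Delta p \sim \hbar/\ell_s,
\quad \text{giving } \Delta x_{\min} \sim \ell_s .
```

The first term is Heisenberg's; the second, a string correction, means that pumping up Δp beyond the string scale no longer improves spatial resolution.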

50 years on, is the theory any closer to describing reality?

People say that string theory doesn't make predictions, but that's simply not true. It predicts the dimensionality of space (it is the only theory so far to do so), and it also predicts, at tree level (the lowest level of approximation for a quantum-relativistic theory), a whole lot of massless scalars that threaten the equivalence principle (the universality of free fall), which is by now very well tested. If we could trust this tree-level prediction, string theory would already be falsified. But the same would be true of QCD, since at tree level it implies the existence of free quarks. In other words: the new string theory, just like the old one, can be falsified by large-distance experiments provided we can trust the level of approximation at which it is solved. To test string theory at short distance, on the other hand, the best way is through cosmology. Around (i.e. at, before, or soon after) the Big Bang, string theory may have left its imprint on the early universe, and the subsequent expansion can bring those imprints to macroscopic scales today.

What do you make of the ongoing debate on the scientific viability of the landscape, or “swamp”, of string-theory solutions?

I am not an expert on this subject but I recently heard (at the Strings 2018 conference in Okinawa, Japan) a talk on the subject by Cumrun Vafa claiming that the KKLT solution [which seeks to account for the anomalously small value of the vacuum energy, as proposed in 2003 by Kallosh, Kachru, Linde and Trivedi] is in the swampland, meaning it's not viable at a fundamental quantum-gravity level. It was followed by a heated discussion and I cannot judge who is right. I can only add that the absence of a metastable de Sitter vacuum would favour quintessence models of the kind I investigated with Thibault Damour several years ago, and that could imply interestingly small (but perhaps detectable) violations of the equivalence principle.

What’s the perception of strings from outside the community?

Some of the popular coverage of string theory in recent years has been rather superficial. When people say string theory can't be proved, it is unfair. The usual argument is that you need inconceivably high energies. But, as I have already said, the new incarnation of string theory can be falsified just like its predecessor was; it soon became very clear that QCD was a better theory. Perhaps the same will happen to today's string theory, but I don't think there are serious alternatives at the moment. Clearly the enthusiasm of young people is still there. The field is atypically young – the average age of attendees at a string-theory conference is much lower than that for, say, a QCD or electroweak-physics conference. What is motivating young theorists? Perhaps the mathematical beauty of string theory, or perhaps the possibility of carrying out many different calculations, publishing them and getting lots of citations.

What advice do you offer young theorists entering the field?

I myself regret that most young string theorists do not address the outstanding physics questions of quantum gravity, such as the fate of the initial singularity of classical cosmology in string theory. These are very hard problems, and young people these days cannot afford to spend a couple of years on one such problem without getting out a few papers. When I was young I didn't care about fashions, I just followed my nose and took risks that eventually paid off. Today it is much harder to do so.

How has theoretical particle physics changed since 1968?

In 1968 we had a lot of data to explain and no good theory for the weak and strong interactions. There was a lot to do and within a few years the Standard Model was built. Today we still have essentially the same Standard Model and we are still waiting for some crisis to come out of the beautiful experiments at CERN and elsewhere. Steven Weinberg used to say that physics thrives on crises. The crises today are more in the domain of cosmology (dark matter, dark energy), the quantum mechanics of black holes and really unifying our understanding of physics at all scales, from the Planck length to our cosmological horizon, two scales that are 60 orders of magnitude apart. Understanding such a hierarchy (together with the much smaller one of the Standard Model) represents, in my opinion, the biggest theoretical challenge for 21st century physics.

J-PET’s plastic revolution

The J-PET detector

It is some 60 years since the conception of positron emission tomography (PET), which revolutionised the imaging of physiological and biochemical processes. Today, PET scanners are used around the world, in particular providing quantitative and 3D images for early-stage cancer detection and for maximising the effectiveness of radiation therapies. Some of the first PET images were recorded at CERN in the late 1970s, when physicists Alan Jeavons and David Townsend used the technique to image a mouse. While the principle of PET already existed, the detectors and algorithms developed at CERN made a major contribution to its development. Techniques from high-energy physics could now be about to enable another leap in PET technology.

In a typical PET scan, a patient is administered a radioactive solution that concentrates in malignant cancers. Positrons from β+ decay annihilate with electrons in the body, resulting in the back-to-back emission of two 511 keV gamma rays that are registered in a crystal via the photoelectric effect. These signals are then used to reconstruct an image. Significant advances in PET imaging have taken place in the past few decades, and the vast majority of existing scanners use inorganic crystals – usually bismuth germanium oxide (BGO) or lutetium yttrium orthosilicate (LYSO) – organised in a ring to detect the emitted PET photons.
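The reconstruction principle can be sketched in a few lines. The snippet below (with made-up detector positions and an idealised time-of-flight measurement, not the algorithm of any particular scanner) locates the annihilation point along the line of response between the two photon hits from their arrival-time difference:

```python
import math

C = 29.98  # speed of light in cm/ns

def annihilation_point(p1, t1, p2, t2):
    """Locate the annihilation on the line of response (LOR) p1-p2.

    p1, p2: (x, y) hit positions in cm; t1, t2: photon arrival times in ns.
    If d1 is the distance from p1, then t2 - t1 = (d2 - d1)/c and
    d1 + d2 = |p1 - p2|, hence d1 = (|p1 - p2| - c*(t2 - t1)) / 2.
    """
    dx, dy = p2[0] - p1[0], p2[1] - p1[1]
    d = math.hypot(dx, dy)
    d1 = (d - C * (t2 - t1)) / 2.0
    return (p1[0] + d1 * dx / d, p1[1] + d1 * dy / d)

# Annihilation at (10, 0) between detectors at (-50, 0) and (50, 0):
# the photons travel 60 cm and 40 cm, so the right one arrives first.
x, y = annihilation_point((-50.0, 0.0), 60.0 / C, (50.0, 0.0), 40.0 / C)
print(round(x, 6), round(y, 6))  # → 10.0 0.0
```

In practice, timing resolution smears this point along the LOR, which is why time-of-flight information sharpens, rather than replaces, the tomographic reconstruction.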

The main advantages of crystal detectors are their large stopping power, high probability of photoelectric conversion and good energy resolution. However, inorganic crystals are expensive, limiting the number of medical facilities equipped with PET scanners. Moreover, conventional detectors are limited in their axial field of view: currently a distance of only about 20 cm along the body can be examined simultaneously from a single bed position, meaning that several overlapping bed positions are needed for a whole-body scan, and only about 1% of the quanta emitted from a patient's body are collected. Extending the scanned region from around 20 to 200 cm would not only improve the sensitivity and signal-to-noise ratio, but also reduce the radiation dose needed for a whole-body scan.

To address this challenge, several designs for whole-body scanners have been proposed, based on resistive-plate chambers, straw tubes and alternative crystal scintillators. In 2009, particle physicist Paweł Moskal of Jagiellonian University in Kraków, Poland, introduced a system that uses inexpensive plastic scintillators instead of inorganic ones for detecting photons in PET systems. The Jagiellonian PET (J-PET) detector, based on technologies already employed in the ATLAS, LHCb, KLOE, COSY-11 and other particle-physics experiments, aims to enable cost-effective whole-body PET imaging.

Whole-body imaging

The current J-PET setup comprises 192 detection modules arranged axially in three layers to form a barrel-shaped detector; its construction is based on 17 patent-protected solutions. Each module consists of a 500 × 19 × 7 mm³ scintillator strip made of a commercially available material called EJ-230, with a photomultiplier tube (PMT) connected at each end. Photons are registered via the Compton effect, and each analogue signal from the PMTs is sampled in the voltage domain at four thresholds by dedicated field-programmable gate arrays.

In addition to recording the location and time of the electron–positron annihilation, J-PET determines the energy deposited by the annihilation photons. The 2D position of a hit is known from the scintillator position, while the third spatial component is calculated from the difference in arrival times of the signals at the two ends of the scintillator, enabling direct 3D image reconstruction. PMTs connected to both ends of the scintillator strips compensate for the low detection efficiency of plastic compared to crystal scintillators and enable multi-layer detection. A modular, relatively easy-to-transport PET scanner with a non-magnetic, low-density central part can be used as an insert compatible with magnetic resonance imaging (MRI) or computed tomography. Furthermore, since plastic scintillators are produced in various shapes, the J-PET approach can also be applied to positron emission mammography (PEM) and as a range monitor for hadron therapy.
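The axial-coordinate trick can be illustrated in a few lines (the effective light-propagation speed in the strip below is an assumed, illustrative number, not a J-PET specification):

```python
# Sketch of reconstructing the hit position along a scintillator strip
# from the difference of light arrival times at the two PMTs.
V_EFF = 12.6  # cm/ns, assumed effective speed of light in the plastic

def hit_position(t_left, t_right):
    """Position along the strip, measured from its centre (cm).

    Light from a hit at z > 0 reaches the right PMT earlier, so
    z = v_eff * (t_left - t_right) / 2.
    """
    return V_EFF * (t_left - t_right) / 2.0

# A hit 10 cm right of the centre of a 50 cm strip: light travels
# 35 cm to the left PMT and 15 cm to the right one.
t_l, t_r = 35.0 / V_EFF, 15.0 / V_EFF
print(hit_position(t_l, t_r))  # → 10.0
```

The time sum of the same two signals, by contrast, is independent of the hit position and gives the interaction time used for the lifetime and time-of-flight measurements.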


J-PET can also build images from positronium (a bound state of an electron and a positron) that gets trapped in intermolecular voids. In about 40% of cases, positrons injected into the human body create positronium, whose lifetime and other properties are environmentally sensitive. Currently this information is neither recorded nor used for PET imaging, but recent J-PET measurements of the positronium lifetime in normal and cancerous skin cells indicate that the properties of positronium may serve as diagnostic indicators for cancer therapy. Medical doctors are excited by the avenues opened by J-PET. These include a larger axial view (e.g. to check correlations between organs separated by more than 20 cm in the axial direction), the possibility of performing combined PET-MRI imaging at the same time and place, and the possibility of simultaneous PET and positronium (morphometric) imaging, paving the way for in vivo determination of cancer malignancy.

Such a large detector is not only potentially useful for medical applications. It can also be used in materials science, where positron annihilation lifetime spectroscopy (PALS) enables the study of voids and defects in solids, while precise measurements of positronium atoms lead to morphometric imaging and physics studies. In this latter regard, the J-PET detector offers a powerful new tool to test fundamental symmetries.

Combinations of discrete symmetries (charge conjugation C, parity P, and time reversal T) play a key role in explaining the observed matter–antimatter asymmetry in the universe (CP violation) and are the starting point for all quantum field theories preserving Lorentz invariance, unitarity and locality (CPT symmetry). Positronium is a good system enabling a search for C, T, CP and CPT violation via angular correlations of annihilation quanta, while the positronium lifetime measurement can be used to separate the ortho- and para-positronium states (o-Ps and p-Ps). Such decays also offer the potential observation of gravitational quantum states, and are used to test Lorentz and CPT symmetry in the framework of the Standard Model Extension.

At J-PET, the following reaction chain is predominantly considered: ²²Na → ²²Ne* + e⁺ + νₑ, ²²Ne* → ²²Ne + γ, and e⁺e⁻ → o-Ps → 3γ annihilation. The detection of the 1274 keV prompt γ from the ²²Ne de-excitation is the start signal for the positronium-lifetime measurement. Currently, tests of discrete symmetries and quantum entanglement of photons originating from the decay of positronium atoms are the main physics topics investigated by the J-PET group. The first data taking was conducted in 2016 and six data-taking campaigns have concluded with almost 1 PB of data. Physics studies are based on data collected with a point-like source placed in the centre of the detector and covered by a porous polymer to increase the probability of positronium formation. A test measurement with a source surrounded by an aluminium cylinder was also performed. The use of a cylindrical target (figure 1, left) allows researchers to separate in space the positronium formation and annihilation (on the cylinder wall) from the positron emission (at the source). Most recently, measurements were also performed with a cylinder whose inner wall was covered by the porous material.

Figure 1

The J-PET programme aims to beat the precision of previous measurements for C, CP and CPT symmetry tests in positronium, and to be the first to observe a potential T-symmetry violation. Tests of C symmetry are conducted via searches for the forbidden decays of the positronium triplet state (o-Ps) to 4γ and of the singlet state (p-Ps) to 3γ. Tests of the other fundamental symmetries and their combinations will be performed by measuring the expectation values of symmetry-odd operators constructed from the spin of o-Ps and the momenta and polarisation vectors of the photons originating from its annihilation (figure 1, right). The physical limit of such tests is expected at the level of about 10⁻⁹ due to photon–photon interaction, which is six orders of magnitude below the present experimental limits (e.g. from the University of Tokyo and the Gammasphere experiment).

Since J-PET is built of plastic scintillators, it provides an opportunity to determine the photon’s polarisation through the registration of primary and secondary Compton scatterings in the detector. This, in turn, enables the study of multi-partite entanglement of photons originating from the decays of positronium atoms. The survival of particular entanglement properties in the mixing scenario may make it possible to extract quantum information in the form of distinct entanglement features, e.g. from metabolic processes in human bodies.

Currently a new, fourth J-PET layer is under construction (figure 2). Each detection unit of this layer comprises 13 plastic-scintillator strips and weighs about 2 kg, making it easy to transport and to assemble on site a portable tomographic chamber whose radius can be adjusted for different purposes by using a given number of such units.

Figure 2

The J-PET group is a collaboration between several Polish institutions – Jagiellonian University, the National Centre for Nuclear Research Świerk and Maria Curie-Skłodowska University – as well as the University of Vienna and the National Laboratory in Frascati. The research is funded by the Polish National Centre for Research and Development, by the Polish Ministry of Science and Higher Education and by the Foundation for Polish Science. Although the general interest in improved quality of medical diagnosis was the first step towards this new detector for positron annihilation, today the basic-research programme is equally advanced. The only open question at J-PET is whether a high-resolution full human body tomographic image will be presented before the most precise test of one of nature’s fundamental symmetries.

Quantum thinking required

Cooling technology

The High-Luminosity Large Hadron Collider (HL-LHC), due to start operating around 2026, will require a computing capacity 50–100 times greater than currently exists. The large uncertainty in this number is mostly due to the difficulty of knowing how well the code used in high-energy physics (HEP) can benefit from new, hyper-parallel computing architectures as they become available. Up to now, code modernisation has been an area in which the HEP community has generally not fared too well.

We need to think differently to address the vast increase in computing requirements ahead. Before the Large Electron–Positron collider was launched in the 1980s, its computing challenges also seemed daunting; early predictions underestimated them by a factor of 100 or more. Fortunately, new consumer technology arrived and made scientific computing, hitherto dominated by expensive mainframes, suddenly more democratic and cheaper.

A similar story unfolded with the LHC, for which the predicted computing requirements were so large that IT planners offering their expert view were accused of sabotaging the project! This time, the technology that made it possible to meet these requirements was grid computing, conceived at the turn of the millennium and driven largely by the ingenuity of the HEP community.

Looking forward to the HL-LHC era, we again need to make sure the community is ready to exploit further revolutions in computing. Quantum computing is certainly one such technology on the horizon. Thanks to the visionary ideas of Feynman and others, the concept of quantum computing was popularised in the early 1980s. Since then, theorists have explored its mind-blowing possibilities, while engineers have struggled to produce reliable hardware to turn these ideas into reality.

Qubits are the basic units of quantum computing: thanks to superposition and entanglement, n qubits can represent 2ⁿ different states on which the same calculation can be performed simultaneously. A quantum computer with 79 entangled qubits has an Avogadro number of states (about 10²³); with 263 qubits, such a machine could represent as many concurrent states as there are protons in the universe; while an upgrade to 400 qubits could contain all the information encoded in the universe.
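The counting behind these claims is easy to check, taking round-number estimates for Avogadro's number and for the proton count of the observable universe:

```python
import math

AVOGADRO = 6.022e23  # Avogadro's number

# n entangled qubits span a state space with 2**n basis states.
print(2**79 / AVOGADRO)    # ≈ 1.0: 79 qubits reach Avogadro's number
print(math.log10(2**263))  # ≈ 79: the order of magnitude commonly
                           #   estimated for the number of protons
                           #   in the observable universe
```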

However, the road to unlocking this potential – even partially – is long and arduous. Measuring the quantum states that result from a computation can prove difficult, offsetting some of the potential gains. Also, since classical logic operations tend to destroy the entangled state, quantum computers require special reversible gates. The hunt has been on for almost 30 years for algorithms that could outperform their classical counterparts. Some have been found, but it seems clear that there will be no universal quantum computer on which we will be able to compile our C++ code and then magically run it faster. Instead, we will have to recast our algorithms and computing models for this brave new quantum world.
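As a toy illustration of such a reversible gate (a minimal pure-Python sketch, not any real quantum-computing API), the Hadamard gate creates an equal superposition from |0⟩ and, being its own inverse, undoes itself when applied twice:

```python
import math

def hadamard(state):
    """Apply the Hadamard gate to a single-qubit state [a, b]."""
    a, b = state
    s = 1.0 / math.sqrt(2.0)
    return [s * (a + b), s * (a - b)]

ket0 = [1.0, 0.0]                   # the basis state |0>
psi = hadamard(ket0)                # (|0> + |1>)/sqrt(2)
probs = [amp * amp for amp in psi]  # Born-rule probabilities
print(probs)                        # each outcome has probability 0.5

# Reversibility: applying the gate twice restores the input state.
back = hadamard(psi)
print([round(amp, 10) for amp in back])  # → [1.0, 0.0]
```

Classical irreversible operations such as AND discard information about their inputs; quantum gates, being unitary, never do, which is why quantum circuits must be built from reversible building blocks like this one.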

In terms of hardware, progress is steady but the prizes are still a long way off. The qubit entanglement in existing prototypes, even when cooled to the level of millikelvins, is easily lost and the qubit error rate is still painfully high. Nevertheless, a breakthrough in hardware could be achieved at any moment.

A few pioneers are already experimenting with HEP algorithms and simulations on quantum computers, with significant quantum-computing initiatives having been announced recently in both Europe and the US. In CERN openlab, we are now exploring these opportunities in collaboration with companies working in the quantum-computing field – kicking things off with a workshop at CERN in November (see below).

The HEP community has a proud tradition of being at the forefront of computing. It is therefore well placed to make significant contributions to the development of quantum computing – and stands to benefit greatly, if and when its enormous potential finally begins to be realised.

Nanoelectronics: Materials, Devices, Applications (2 volumes)

By R Puers, L Baldi, M Van de Voorde and S E van Nooten (editors)
Wiley–VCH


This book aims to provide an overview of both present and emerging nanoelectronic devices, focusing on their numerous applications such as memories, logic circuits, power devices and sensors. It forms one unit (in two volumes) of a complete series of books dedicated to nanoscience and nanotechnology and their penetration of many different fields, ranging from human health, agriculture and food science to energy production, environmental protection and metrology.

After an introduction to the semiconductor industry and its development, different kinds of devices are discussed. Specific chapters are also dedicated to new materials, device-characterisation techniques, smart manufacturing and advanced circuit design. The book then covers the many applications, also highlighting the emerging trends and economic factors influencing the progress of the nanoelectronics industry.

Since nanoelectronics is nowadays fundamental for any science and technology that requires communication and information processing, this book can be of interest to electronic engineers and applied physicists working with sensors and data-processing systems.

Picturing Quantum Processes: A First Course in Quantum Theory and Diagrammatic Reasoning

By Bob Coecke and Aleks Kissinger
Cambridge University Press


“This book is about telling the story of quantum theory entirely in terms of pictures,” declare the authors of this unusual book, in which quantum processes are explained using diagrams and an innovative method for presenting complex theories is set up. The book employs a unique formalism developed by the authors, which allows a more intuitive understanding of quantum features and eliminates complex calculations. As a result, knowledge of advanced mathematics is not required.

The entirely diagrammatic presentation of quantum theory proposed in this (bulky) volume is the result of 10 years of work and research carried out by the authors and their collaborators, uniting classical techniques in linear algebra and Hilbert spaces with cutting-edge developments in quantum computation and foundational QM.

An informal and entertaining style is adopted, which makes this book easily approachable by students at their first encounter with quantum theory. That said, it will probably appeal more to PhD students and researchers who are already familiar with the subject and are interested in looking at a different treatment of this matter. The text is also accompanied by a rich set of exercises.

Essential Quantum Mechanics for Electrical Engineers

By Peter Deák
Wiley–VCH


The latest and upcoming developments in electronic devices for information technology increasingly rely on physical phenomena that cannot be understood without some knowledge of quantum mechanics (QM). In the new hardware, switching happens at the level of single electrons and tunnelling effects are frequently used; in addition, the superposition of electron states is the foundation of quantum information processing. As a consequence, the study of QM, as well as informatics, is now being introduced in undergraduate electrical and electronic engineering courses. However, there is still a lack of textbooks on this subject written specifically for such courses.

The aim of the author was to fill this gap and provide a concise book in which both the basic concepts of QM and its most relevant applications to electronics and information technologies are covered, making use of only the very essential mathematics.

The book starts off with classical electromagnetism and shows its limitations when it comes to describing the phenomena involved in modern electronics. More advanced concepts are then gradually introduced, from wave–particle duality to the mathematical construction used to describe the state of a particle and to predict its properties. The quantum well and tunnelling through a potential barrier are explained, followed by a few applications, including light-emitting diodes, infrared detectors, quantum cascade lasers, Zener diodes, flash memories and the scanning tunnelling microscope. Finally, the author discusses some of the consequences of QM for the chemical properties of atoms and other many-electron systems, such as semiconductors, as well as the potential hardware for quantum information processing.

Even though the mathematical formulation of basic concepts is introduced when required, the author's approach is oriented towards limiting calculations and abstraction in favour of practical applications. Applets, accessible on the internet, are also used as a support to ease the computational work and quickly visualise the results.
