

Farewell Daya Bay, hello JUNO

Daya Bay

In October 2007, neutrino physicists broke ground 55 km north-east of Hong Kong to build the Daya Bay Reactor Neutrino Experiment. Comprising eight 20-tonne liquid-scintillator detectors sited within 2 km of the Daya Bay nuclear plant, the experiment set out to look for the disappearance of electron antineutrinos as a function of distance from the reactor. Such a deficit would constitute evidence for mixing between the electron neutrino and the third neutrino mass eigenstate, as described by the parameter θ13. Back then, θ13 was the least well-known angle in the Pontecorvo–Maki–Nakagawa–Sakata matrix, which quantifies lepton mixing, with only an upper limit available. Today, it is the best-known angle by some margin, and the knowledge that it is nonzero has opened the door to measuring leptonic CP violation at long-baseline accelerator-neutrino experiments.
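In the two-flavour approximation that dominates at these baselines, the survival probability the experiments fit takes the textbook form (the generic oscillation formula, not the collaborations' full three-flavour treatment):

    P(\bar{\nu}_e \to \bar{\nu}_e) \approx 1 - \sin^2 2\theta_{13}\,\sin^2\!\left(\frac{\Delta m^2_{31} L}{4E}\right)

For few-MeV reactor antineutrinos and |Δm²31| ≈ 2.5 × 10⁻³ eV², the first oscillation minimum falls near L ≈ 2 km, which motivates siting detectors within about that distance of the reactor cores.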

Daya Bay was one of a trio of experiments located in close proximity to nuclear reactors, along with RENO in South Korea and Double Chooz in France, which were responsible for this seminal measurement. Double Chooz published the first hint that θ13 was nonzero in 2011, before Daya Bay and RENO established this conclusively the following spring. The experiments also failed to dispel the reactor–antineutrino anomaly, whereby observed neutrino fluxes are a few percent lower than calculations predict. This has triggered a slew of new experiments located mere metres from nuclear-reactor cores, in search of evidence for oscillations involving additional, sterile light neutrinos. As the Daya Bay experiment’s detectors are dismantled, after almost a decade of data taking, the three collaborations can reflect on the rare privilege of having pencilled the value of a previously unknown parameter into the Standard Model Lagrangian.

Particle physics is fundamental and influential, and deserves to be supported

Yi-Fang Wang

Founding Daya Bay co-spokesperson Yi-Fang Wang says the experiment has had a transformative effect on Chinese particle physics, emboldening the country to explore major projects such as a circular electron–positron collider. “One important lesson we learnt from Daya Bay is that we should just go ahead and do it if it is a good project, rather than waiting until everything is ready. We convinced our government that we could do a great job, that world-class jobs need to be international, and that particle physics is fundamental and influential, and deserves to be supported.”

JUNO

The experiment has also paved the way for China to build a successor, the Jiangmen Underground Neutrino Observatory (JUNO), for which Wang is now spokesperson. JUNO will tackle the neutrino mass hierarchy – the question of whether the third neutrino mass eigenstate is the most or least massive of the three. An evolution of Daya Bay, the new experiment will also measure a deficit of electron antineutrinos, but at a distance of 53 km, seeking to resolve fast and shallow oscillations that are expected to differ depending on the neutrino mass hierarchy. Excavation of a cavern for the 20 kilotonne liquid-scintillator detector 700 m beneath the Dashi hill in Guangdong was completed at the end of 2020. The construction of a concrete water pool is the next step.
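The ordering effect JUNO is chasing can be sketched numerically. The minimal Python sketch below evaluates the standard three-flavour reactor ν̄e survival probability at the 53 km baseline; the oscillation parameters are indicative global-fit values rather than JUNO's official inputs, and the sign flip is only a crude proxy for the inverted ordering:

    import numpy as np

    # Indicative oscillation parameters (approximate global-fit values)
    S12, S13 = 0.307, 0.022   # sin^2(theta_12), sin^2(theta_13)
    DM21 = 7.5e-5             # solar splitting, eV^2
    DM31 = 2.5e-3             # atmospheric splitting, eV^2 (normal ordering)

    def survival(E_MeV, L_km=53.0, dm31=DM31):
        """Standard three-flavour reactor anti-nu_e survival probability."""
        def osc(dm2):  # sin^2(1.267 dm2[eV^2] L[km] / E[GeV])
            return np.sin(1.267 * dm2 * L_km / (E_MeV * 1e-3)) ** 2
        dm32 = dm31 - DM21
        return (1
                - 4 * S13 * (1 - S13) * ((1 - S12) * osc(dm31) + S12 * osc(dm32))
                - (1 - S13) ** 2 * 4 * S12 * (1 - S12) * osc(DM21))

    E = np.linspace(1.8, 8.0, 1000)               # reactor spectrum range, MeV
    diff = survival(E) - survival(E, dm31=-DM31)  # crude normal-vs-inverted proxy
    print(f"largest ordering-induced shift: {np.abs(diff).max():.3f}")

The two orderings differ only through a small relative phase between the fast wiggles driven by Δm²31 and Δm²32, which is why excellent energy resolution is the experiment's defining requirement.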

The next steps in reactor-neutrino physics will involve an extraordinary miniaturisation

Thierry Lasserre

The detector concept that the three experiments used to uncover θ13 was designed by the Double Chooz collaboration. Thierry Lasserre, one of the experiment’s two founders, recalls that it was difficult, 20 years ago, to convince the community that the measurement was possible at reactors. “It should not be forgotten that significant experimental efforts were also undertaken in Angra dos Reis, Braidwood, Diablo Canyon, Krasnoyarsk and Kashiwazaki,” he says. “Reactor neutrino detectors can now be used safely, routinely and remotely, and some of them can even be deployed on the surface, which will be a great advantage for non-proliferation applications.” The next steps in reactor-neutrino physics, he explains, will now involve an extraordinary miniaturisation to cryogenic detectors as small as 10 grams, which take advantage of the much larger cross section of coherent neutrino scattering.

A day with particles

Outreach must continue, even in a pandemic: if visitors can’t come to the lab, we need to find ways to bring the lab to them. Few outreach initiatives do this as charmingly as “A day with particles” — a short independent film by three ATLAS physicists at Charles University in Prague. Mixing hand-drawn animations, deft sound design and a brisk script targeted at viewers with no knowledge of physics, the 30-minute film follows a day in the life of postdoc Vojtech Pleskot. In its latest pitstop on a worldwide tour of indie film festivals, it won “BEST of FEST (Top Geek)” last week at GeekFestToronto.

We want to break stereotypes about scientists

Vojtech Pleskot

“We just want to show that scientists are absolutely normal people, and that no one needs to fear them,” says Pleskot, who wrote and directed the film alongside producer Martin Rybar and animator Daniel Scheirich. Modest and self-effacing when I interviewed them, the three physicists produced the film with no funding and no prior expertise, beating competition from well-funded projects to win the Canadian award. Even within the vibrant but specialist niche of high-energy-physics geekery, competition included “The world of thinking”, featuring interviews with Ed Witten, Freeman Dyson and others, and a professionally produced film dramatising a love letter from Richard Feynman to his late wife Arline, who passed away while he was working on the Manhattan Project. But Pleskot, Rybar and Scheirich won the judges over with their idiosyncratic distillation of life at the rock-face of discovery. The trio place their film in the context of growing scepticism of science and scientists. “We want to break stereotypes about scientists,” adds Pleskot.

Not every stereotype is broken, and there is room to quibble about some of the details, but grassroots projects such as A Day With Particles boast a quirky authenticity which is difficult to capture through institutional planning, and is well placed to connect emotionally with non-physicists. The film is beautifully paced. Wide-eyed enthusiasm for physics cuts to an adorable glimpse of Pleskot’s two “cute little particles” having breakfast. A rapid hop from Democritus to Rutherford to the LHC cuts to tracking shots of Pleskot making his way through the streets of Prague to the university. The realities of phone conferences, failed grid jobs and being late for lab demonstrations are interwoven with a grad student dancing to discuss her analysis, conversations on free-diving with turtles and the stories of beloved professors recalling life in the communist era. Life as a physicist is good. And life as a physicist is really like this, they insist. “I hope that science communicators will share it far and wide,” says Connie Potter (CERN and ATLAS), who commissioned the film for the 2020 edition of ICHEP, and who was also recognised by the Toronto festival for her indefatigable “indie spirit” in promoting it.

A Day With Particles will next be considered at the World of Film International Festival in Glasgow in June, where it has been selected as a semi-finalist.

CMS targets Higgs-boson pair production

Figure 1

The Higgs boson discovered in 2012 by the ATLAS and CMS experiments is the pinnacle of the scientific results so far at the LHC. Measurements of its couplings to W and Z bosons and to heavy fermions have provided a strong indication that the mechanism of electroweak symmetry breaking is similar to that proposed by Brout, Englert and Higgs (BEH) more than 50 years ago. In this model, the BEH field exists throughout space with a non-zero field strength corresponding to the minimum of the BEH potential. The measurement of the shape of the BEH potential has become one of the main goals of experimental particle physics. It governs not only the nature of the electroweak phase transition in the early universe, when the BEH field gained its non-zero “vacuum expectation value” (VEV), but also the question of whether deeper minima than the present vacuum exist.

The measurement of the production of Higgs-boson pairs gives a direct way to measure λ

Interactions with the BEH VEV give mass not only to the W and Z bosons and the fermions, but also to the Higgs boson itself. If the mass of the Higgs boson is well known, the Standard Model (SM) can therefore predict the Higgs self-coupling, λ – the key unknown parameter in the shape of the BEH potential of the SM. The measurement of the production of Higgs-boson pairs (HH) gives a direct way to measure λ. Higgs-boson pair production is not yet established experimentally, as it is a thousand times less frequent than the production of a single Higgs boson. However, the presence of physics beyond the SM can substantially enhance the HH production rate. The search for HH production at the LHC is therefore an important test of the SM.
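The chain of logic is short enough to write down. In the SM the BEH potential takes the form below, and minimising it relates the Higgs mass, the VEV (v ≈ 246 GeV) and the self-coupling, so the measured m_H ≈ 125 GeV fixes the prediction:

    V(\phi) = -\mu^2\,\phi^\dagger\phi + \lambda\,(\phi^\dagger\phi)^2,
    \qquad m_H^2 = 2\lambda v^2
    \;\Rightarrow\;
    \lambda_{\mathrm{SM}} = \frac{m_H^2}{2v^2}
    \approx \frac{(125\ \mathrm{GeV})^2}{2\,(246\ \mathrm{GeV})^2} \approx 0.13

Measuring HH production then tests whether the coupling realised in nature matches this number.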

Best constraint

A recent result by the CMS collaboration describes a search for HH production in final states with two photons and two b-jets (figure 1). The large data sample collected during LHC Run 2 excludes an HH production rate larger than 7.7 times that predicted by the SM. CMS has set the best constraint to date on the ratio of the measured λ parameter to the SM prediction, κλ = 0.6 (+6.3, −1.8).

The sensitivity of the analysis has been improved by about a factor of four over the previous result, which used the data collected in 2016, benefitting equally from the increase in luminosity and from a wealth of innovative analysis techniques. The electromagnetic calorimeter of the CMS experiment allows the measurement of H → γγ candidates with excellent resolution (about 1–2%). Advanced machine-learning techniques, including deep neural networks, were introduced to significantly improve the mass resolution of H → bb, from 15% down to 11%. The analysis combines information from the invariant mass of the HH system, reflecting the underlying physics processes, with a multivariate classifier exploring the kinematic properties as well as the identification of photons and b-jets.
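The CMS paper documents its actual networks; purely as an illustration of the kind of tool involved, a b-jet energy-regression network might be sketched as follows (layer sizes, feature count and training data are invented placeholders, written here in Keras):

    import numpy as np
    from tensorflow import keras

    N_FEATURES = 12  # hypothetical per-jet inputs: kinematics, secondary-vertex info, ...

    # Small feed-forward network regressing a per-jet energy correction;
    # the input shape is inferred from the data on the first call to fit().
    model = keras.Sequential([
        keras.layers.Dense(64, activation="relu"),
        keras.layers.Dense(64, activation="relu"),
        keras.layers.Dense(1),                   # target: E_true / E_reco
    ])
    model.compile(optimizer="adam", loss="mae")  # a loss robust to long resolution tails

    # Toy stand-in for simulated b-jets and their true/reconstructed energy ratio
    X = np.random.rand(10_000, N_FEATURES).astype("float32")
    y = (1.0 + 0.1 * np.random.randn(10_000, 1)).astype("float32")
    model.fit(X, y, epochs=2, batch_size=256, verbose=0)

A corrected jet energy from such a regression narrows the reconstructed H → bb mass peak, which is the kind of gain the 15% to 11% improvement reflects.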

Events were categorised to enhance the sensitivity to Higgs production via gluon fusion as well as, for the first time, vector-boson fusion. The latter constrains the quartic coupling between two vector bosons and two Higgs bosons, such as WWHH, which is an extremely rare interaction in the SM. In addition, dedicated categories from a previous analysis were added to account for the associated production of top quarks and a single Higgs boson, and to provide a simultaneous constraint on the top-quark Yukawa coupling and λ. Several hypotheses predicting new physics were also constrained. The results are an encouraging step forwards in the quest to measure the BEH potential and to further interrogate the SM.

Higgs boson gets SMEFT treatment

Figure 1

The growing LHC dataset eight years after the discovery of the Higgs boson allows the experiments to study its properties more and more precisely, searching for hints of physics beyond the Standard Model (SM). New phenomena might occur at energy scales beyond the reach of the LHC, pointing to the existence of so-far undiscovered particles with masses too heavy to be directly produced in 13 TeV proton–proton collisions. Without knowing the exact nature of the new physics, LHC data can be analysed to systematically constrain new types of interactions in the framework of an effective field theory (EFT). One historical EFT example is Fermi’s effective interaction model for nuclear beta decay, which is valid as long as the probed energy scale is well below the mass of the W boson. The move to constrain EFTs rather than signal strengths for couplings marks a new, more comprehensive phase in SM tests at the LHC.
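Schematically, the SMEFT extends the SM Lagrangian with a tower of higher-dimensional operators suppressed by powers of the new-physics scale Λ:

    \mathcal{L}_{\mathrm{SMEFT}} = \mathcal{L}_{\mathrm{SM}}
    + \sum_i \frac{c_i}{\Lambda^2}\,\mathcal{O}_i^{(6)}
    + \mathcal{O}(\Lambda^{-4})

The dimension-six operators O_i^(6) are built from SM fields alone, and the dimensionless Wilson coefficients c_i, all of which vanish in the SM, are the quantities that fits such as the ATLAS analysis described below constrain.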

The move to constrain EFTs marks a new, more comprehensive phase in SM tests at the LHC

Almost all types of new physics would give rise to new interactions with SM particles, with different models leaving different EFT footprints. As the underlying dynamics is not known and effects can be subtle, it is important to combine as many measurements as possible across the full spectrum of the LHC research programme.

A new ATLAS analysis presented at the Higgs 2020 conference, held online from 26 to 30 October, takes a first step in this direction. The analysis combines measurements of production cross-sections and kinematic variables of Higgs-boson events in several decay channels (diphoton, four-lepton and di-b-quark decays) to constrain new phenomena within the so-called SMEFT framework. The combination of measurements allows multiple new interactions involving the Higgs boson to be constrained simultaneously. This approach requires fewer hypotheses on the other unconstrained interactions than studying the EFT terms one measurement at a time. The results are therefore more generic and easier to interpret in a broader context.

Predicted to vanish

Figure 1 shows the allowed ranges for the coupling coefficients of new EFT interactions to which the ATLAS combined Higgs analysis is sensitive. The coefficient c^(3)_Hq, for example, describes the strength of an effective four-particle interaction between two quarks, a gauge boson and the Higgs boson. The SM predicts all these coefficients to vanish, as their corresponding interactions are not present. Significant positive or negative deviations would indicate new physics. For instance, a non-vanishing value of c^(3)_Hq would cause deviations from the SM in the ZH and WH cross-sections at high transverse momentum of the Higgs boson, which are not observed in the measured channels.

All measurements are compatible with the SM, indicating that if new physics is present it either has a mass scale larger than 1 TeV (the reference scale for which these results are reported) or it manifests itself in interactions to which the available measurements are not yet sensitive. In the meantime, thanks to the design of the analysis, the results can be added to wider EFT interpretations that combine measurements from different physics processes (e.g. electroweak-boson or top-quark production) studied by ATLAS and other experiments, providing a consistent and increasingly detailed mapping of the allowed new-physics extensions of the SM.

Quark-matter fireballs hashed out in Protvino

QCD phase diagram

The XXXII international workshop of the Logunov Institute for High-Energy Physics of the NRC Kurchatov Institute in Protvino, near Moscow, brought more than 300 physicists together online from 9 to 13 November to discuss “hot problems in hot and cold quark matter”. The focus of the workshop was chiral theories and lattice simulations, which allow estimates beyond perturbation theory for studying the strongly coupled quark–gluon plasma (sQGP) – the hot and/or dense plasma of quarks and gluons that is created in heavy-ion collisions, and which may exist inside neutron stars.

Participants considered the QCD phase diagram (pictured) as a function of temperature, magnetic field (B), baryon and isospin chemical potentials (μB and μI), and varying quark masses. The crossover line (yellow strip), which marks a transition between hadronic matter and sQGP, has long attracted great interest. Vladimir Skokov (Brookhaven) employed recent progress in the Lee–Yang approach to phase transitions to derive from first principles that μB > 400 MeV at the critical end point (a possible termination of the first-order phase-transition boundary). Discussions of the phase diagram also included a decrease in the pseudocritical temperature with B, the possibility of a first-order phase transition at μB = 0 as B tends to infinity, the existence and location of a superconducting phase, the disagreement between measured and predicted collective flows of direct photons in heavy-ion collisions, and the diamagnetic and paramagnetic natures of the pion gas and deconfined matter, respectively. Evgeny Zabrodin (Oslo) explained that the rotating fireballs of strongly interacting matter that are produced in heavy-ion collisions are not only superfluids but also supervortical liquids.

Gravitational-wave astrophysics

Impressive work was also shared at the intersection of heavy-ion collisions and gravitational-wave astrophysics on the subject of the equation of state (EoS) of neutron-star cores. The EoS is the relationship between pressure and density, and can indicate whether hadronic or quark matter lies inside. Theoretical bounds on the EoS come from chiral effective theories, perturbative QCD and the conjectured bound on the speed of sound, c_s² < 1/3. The quantities that can be extracted from experimental data are the mass–radius relation and the relationship between the tidal deformabilities of merging neutron stars and the peak frequency of the emitted gravitational waves. Several speakers observed that tidal deformabilities, which are measured in the inspiral phase, and the peak gravitational-wave frequency, which is measured in the post-merger phase, may together reveal the state of a neutron-star interior. Mergers observed since 2017 may already be able to shed light on the existence of a deconfined phase inside these ultra-compact objects.
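In units where c = 1, the sound speed referred to above is the adiabatic derivative of the EoS, and the conjectured conformal bound reads:

    c_s^2 = \frac{\partial p}{\partial \varepsilon} \le \frac{1}{3}

with p the pressure and ε the energy density. Perturbative QCD approaches the bound from below, and whether it is exceeded inside the most massive neutron stars is one of the open questions the merger data may help settle.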

Mariana Araújo offered a solution to the longstanding quarkonium polarisation puzzle

The Protvino workshop also revealed the enduring importance of studying heavy-quark physics. Since heavy quarks can be treated as approximately static colour sources, studies of quarkonia production are a step towards understanding hadron formation and the confinement mechanism. Peter Petreczky (Brookhaven) concluded from a lattice study of Bethe–Salpeter amplitudes that the potential model fails to describe bottomonium in terms of a screened potential at high temperatures, with further investigations clearly needed in this field. Carlos Lourenço (CERN) showed that the lowering of quarkonia binding energies in the sQGP leads to nontrivial measured suppression patterns. Eric Braaten (Ohio) showed that the decrease with multiplicity of the ratio of the prompt production rates of X(3872) and ψ(2S) in proton–proton collisions can be explained by the scattering of co-moving pions off X(3872) if it is a weakly bound charm-meson molecule. With equally impressive scrupulousness, Mariana Araújo (Innsbruck) offered a solution to the longstanding “quarkonium polarisation puzzle” by making use of a model-independent fitting procedure and taking into account correlations between cross sections and polarisations.

The next “hot problems” workshop will be held in November.

HPC collaboration kicks off

CERN has welcomed more than 120 delegates to an online kick-off workshop for a new collaboration on high-performance computing (HPC). CERN, SKAO (the organisation leading the development of the Square Kilometre Array), GÉANT (the pan-European network and services provider for research and education) and PRACE (the Partnership for Advanced Computing in Europe) will work together to realise the full potential of the coming generation of HPC technology for data-intensive science.

It is an exascale project for an exascale problem

Maria Girone

“It is an exascale project for an exascale problem,” said Maria Girone, CERN coordinator of the collaboration and CERN openlab CTO, in opening remarks at the workshop. “HPC is at the intersection of several important R&D activities: the expansion of computing resources for important data-intensive science projects like the HL-LHC and the SKA, the adoption of new techniques such as artificial intelligence and machine learning, and the evolution of software to maximise the potential of heterogeneous hardware architectures.”

The 29 September workshop, which was organised with the support of CERN openlab, saw participants establish the collaboration’s foundations, outline initial challenges and begin to define the technical programme. Four main initial areas of work were discussed at the event: training and centres of expertise, benchmarking, data access, and authorisation and authentication.

One of the largest challenges in using new HPC technology is the need to adapt to heterogeneous hardware. This involves the development and dissemination of new programming skills, which is at the core of the new HPC collaboration’s plan. A number of examples showing the potential of heterogeneous systems were discussed. One is the EU-funded DEEP-EST project, which is developing a modular supercomputing prototype for exascale computing. DEEP-EST has already contributed to the re-engineering of high-energy physics algorithms for accelerated architectures, highlighting the significant mutual benefits of collaboration across fields when it comes to HPC. PRACE’s excellent record of providing support and training will also be critical to the success of the collaboration.

Benchmarking progress

Establishing a common benchmark suite will help the organisations to measure and compare the performance of different types of computing resources for data-analysis workflows from astronomy and particle physics. The suite will include applications representative of the HEP and astrophysics communities – reflecting today’s needs, as well as those of the future – and augment the existing Unified European Applications Benchmark Suite.

Access is another challenge when using HPC resources. Data from the HL-LHC and the SKA will be globally distributed and will be moved over high-capacity networks, staged and cached to reduce latency, and eventually processed, analysed and redistributed. Accessing the HPC resources themselves involves adherence to strict cyber-security protocols. A technical area devoted to authorisation and authentication infrastructure is defining demonstrators to enable large scientific communities to securely access protected resources.

The collaboration will now move forward with its ambitious technical programme. Working groups are forming around specific challenges, with the partner organisations providing access to appropriate testbed resources. Important activities are already taking place in all four areas of work, and a second collaboration workshop will soon be organised.

AEgIS on track to test free-fall of antimatter

AEgIS

The AEgIS collaboration at CERN’s Antiproton Decelerator (AD) has reported a milestone in its bid to measure the gravitational free-fall of antimatter – a fundamental test of the weak equivalence principle. Using a series of techniques developed in 2018, the team demonstrated the first pulsed production of antihydrogen, which allows the time at which the antiatoms are formed to be known with high accuracy. This is a key step in determining “g” for antimatter.

“This is the first time that pulsed formation of antihydrogen has been established on timescales that open the door to simultaneous manipulation, by lasers or external fields, of the formed atoms, as well as to the possibility of applying the same method to pulsed formation of other antiprotonic atoms,” says AEgIS spokesperson Michael Doser of CERN. “Knowing the moment of antihydrogen formation is a powerful tool.”

General relativity’s weak equivalence principle holds that all particles with the same initial position and velocity should follow the same trajectories in a gravitational field. It has been verified for matter with an accuracy approaching 10⁻¹⁴. Since theories beyond the Standard Model such as supersymmetry, or the existence of Lorentz-symmetry-violating terms, do not necessarily lead to an equivalent force on matter and antimatter, finding even the slightest difference in g would suggest the presence of quantum effects in the gravitational arena. Indirect arguments constrain possible differences to below 10⁻⁶ g, but no direct measurement for antimatter has yet been performed due to the difficulty in producing and containing large quantities of it.

ALPHA, AEgIS and GBAR are all targeting a measurement of g at the 1% level in the coming years.

Antihydrogen’s neutrality and long lifetime make it an ideal system in which to test this and other fundamental laws, such as CPT invariance. The first production of low-energy antihydrogen, reported in 2002 by the ATHENA and ATRAP collaborations at the AD, involved a three-body recombination reaction (e⁺ + e⁺ + p̄ → H̄ + e⁺) in clouds of antiprotons and positrons. Since then, steady progress by the AD’s ALPHA collaboration in producing, manipulating and trapping ever larger quantities of antihydrogen has enabled spectroscopic and other properties of antimatter to be determined in exquisite detail.

Whereas three-body recombination results in an almost continuous antihydrogen source, in which it is not possible to tag the time of antiatom formation, AEgIS has employed an alternative charge-exchange process between trapped and cooled antiprotons and positronium (an e⁺e⁻ bound system). Bursts of positrons are accelerated and then implanted into a nano-channelled silicon target above an electromagnetic trap containing cold antiprotons, where, with the aid of laser pulses, they produce a cloud of excited positronium a few millimetres across. This can lead to the formation of antihydrogen within sub-μs timescales, the moment of production being defined by the well-known laser firing time and the transit time of positronium towards the antiproton cloud. Since the antihydrogen is not trapped in the apparatus, it drifts in all directions until it annihilates on the surrounding material, producing pions and photons that are detected by a scintillating array read out by photomultipliers. The scheme allows the time at which 90% of the atoms are produced to be determined with an uncertainty of around 100 ns.

Further steps are required before the measurement of g can begin, explains Doser. These include the formation of a pulsed beam, greater quantities of antihydrogen, and the ability to make it colder. “With only three months of beam time this year, and lots of new equipment to commission, most likely 2022 will be the year in which we establish pulsed beam formation, which is a prerequisite for us to perform a gravity measurement.”

Targeted approach

Following a proof-of-principle measurement of g for antihydrogen by the ALPHA collaboration in 2013, ALPHA, AEgIS and a third AD experiment, GBAR, are all targeting a measurement of g at the 1% level in the coming years. In contrast to AEgIS’s approach, whereby the vertical deviation of a pulsed horizontal beam of cold antihydrogen atoms will be measured in an approximately 1 m-long flight tube, GBAR will take advantage of advances in ion-cooling techniques to measure ultraslow antihydrogen atoms as they fall from a height of 20 cm. ALPHA, meanwhile, will release antihydrogen atoms from a vertical magnetic trap and measure the distribution of annihilation positions when they hit the wall – ramping the trap down slowly so that the coldest atoms, which are most sensitive to gravity, come out last. All three experiments have recently been hooked up to the AD’s ELENA synchrotron, which enables the production of very low-energy antiprotons.

Given that most of the mass of antinuclei comes from massless gluons that bind their constituent quarks, physicists think it unlikely that antimatter experiences an opposite gravitational force to matter and therefore “falls up”. Nevertheless, precise measurements of the free fall of antiatoms could reveal subtle differences that would open an important crack in current understanding.

Return of the double simplex

Standard Model

Popular representations of the Standard Model (SM) often hide its beautiful weirdness, for example slotting quarks and leptons into boxes and arranging them like a low-grade Mendeleev, or contriving a dartboard arrangement. The “double simplex” scheme invented in 2005 by US theorist Chris Quigg, which was recently given a flashy makeover by Quanta magazine (see image), is much richer (arXiv:hep-ph/0509037).

Jogesh Pati and Abdus Salam’s suggestion, in their 1974 shot at a grand unified theory, that lepton number be regarded as a fourth colour inspired Quigg to place the leptons at the fourth point of an SU(4) tetrahedron. The additional edges therefore represent possible leptoquark transitions. Left-handed fermion doublets (left) are reflected in the broken mirror of parity to reveal right-handed fermion singlets (right), though Quanta, unlike Quigg, omits possible right-handed neutrinos, perhaps favouring a purely left-handed Majorana mass term.

A final distinction is that Quigg chooses to superimpose the left and right simplexes – a term for a generalised triangle or tetrahedron in an arbitrary number of dimensions – while Quanta elects to separate the tetrahedra, and label couplings to the Higgs boson with sweeping loops. This obscures a beautiful feature of Quigg’s design, whereby the Yukawa couplings hypothesised by the SM, which couple the left- and right-handed incarnations of massive fermions in interactions with the Higgs field, link opposite corners of the superimposed double simplex, placing the Higgs boson at the centre of the picture. Quigg, who intended that the double simplex precipitate questions, also points out that the corners of the superimposed tetrahedra define a cube, whose edges suggest a possible new category of feeble interactions yet to be discovered.

Learning language by machine

Lingvist CEO Mait Müntel talks to Rachel Bray

Mait Müntel came to CERN as a summer student in 2004 and quickly became hooked on particle physics, completing a PhD in the CMS collaboration in 2008 with a thesis devoted to signatures of doubly charged Higgs bosons. Continuing in the field, he was one of the first to do shifts in the CMS control room when the LHC ramped up. It was then that he realised that the real LHC data looked nothing like the Monte Carlo simulations of his student days. Many things had to be rectified, but Mait admits he was none too fond of coding and didn’t have any formal training. “I thought I would simply ‘learn by doing’,” he says. “However, with hindsight, I should probably have been more systematic in my approach.” Little did he know that, within a few years, he would be running a company with around 40 staff developing advanced language-learning algorithms.

Memory models

Despite spending long periods in the Geneva region, Mait had not found the time to pick up French. Frustrated, he began to take an interest in the use of computers to help humans learn languages at an accelerated speed. “I wanted to analyse from a statistical point of view the language people were actually speaking, which, having spent several years learning both Russian and English, I was convinced was very different to what is found in academic books and courses,” he says. Over the course of one weekend, he wrote a software crawler that enabled him to download a collection of French subtitles from a film database. His next step was to study memory models to understand how one acquires new knowledge, calculating that, if a computer program could intelligently decide what would be optimal to learn in the next moment, it would be possible to learn a language in only 200 hours. He started building some software using ROOT (the object-oriented program and library developed by CERN for data analysis) and, within two weeks, was able to read a proper book in French. “I had included a huge book library in the software and as the computer knew my level of vocabulary, it could recommend books for me. This was immensely gratifying and pushed me to progress even further.” Two months later, he passed the national French language exam in Estonia.
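His calculation maps onto a standard exponential-forgetting picture: model each word's recall probability as decaying with time since its last review, and always study the item most at risk of being forgotten. A minimal sketch of that scheduling idea (the decay model and numbers are illustrative, not Lingvist's actual algorithm):

    import math, time

    def recall_probability(hours_since_review, strength_hours):
        """Exponential forgetting curve: strength is the decay timescale in hours."""
        return math.exp(-hours_since_review / strength_hours)

    def next_item(items, now):
        """Pick the word with the lowest predicted recall, i.e. the one
        the learner is most likely to forget next."""
        def recall(item):
            hours = (now - item["last_seen"]) / 3600
            return recall_probability(hours, item["strength"])
        return min(items, key=recall)

    vocab = [
        {"word": "bonjour", "last_seen": time.time() - 2 * 3600,  "strength": 24.0},
        {"word": "fromage", "last_seen": time.time() - 24 * 3600, "strength": 12.0},
    ]
    print(next_item(vocab, time.time())["word"])  # -> "fromage"

In a full system, each successful review would also increase the item's strength, so that well-learned words resurface at ever longer intervals.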

Mait became convinced that he had to do something with his idea. So he went on holiday and hired two software developers to adapt his code so it would work on the web. Whilst on holiday, he happened to meet a friend of a friend, who helped him set up Lingvist as a company. Estonia, he says, has a fantastic start-up and software-development culture thanks to Skype, which was invented there. Later, Mait met the technical co-founder of Skype at a conference, who coincidentally had been working on software to accelerate human learning; he dropped his own attempts and became Lingvist’s first investor.

Short-term memory capabilities can differ between five minutes and two seconds!

Mait Müntel

The pair secured a generous grant from the European Union Horizon 2020 programme and things were falling into place, though it wasn’t all easy, says Mait: “You can use the analogy of sitting in a nice warm office at CERN, surrounded by beautiful mountains. In the office, you are safe and protected, but if you go outside and climb the mountains, you encounter rain and hail. It is an uphill struggle and very uncomfortable, but immensely satisfying when you reach the summit. Even if you work more than 100 hours per week.”

Lingvist currently has three million users, and Mait is convinced that the technology can be applied to all types of education. “What our data have demonstrated is that levels of learning in people are very different. Short-term memory capabilities can differ between five minutes and two seconds! Currently, based on our data, the older generation has much better memory characteristics. The benefit of our software is that it measures memory, and no matter one’s retention capabilities, the software will help improve retention rates.”

New talents

Faced with a future where artificial intelligence will make many jobs extinct, and many people will need to retrain, competitiveness will be derived from the speed at which people can learn, says Mait. He is now building Lingvist’s data-science research team to grow the company to its full potential, and is always on the lookout for new CERN talent. “Traditionally, physicists have excellent modelling, machine-learning and data-analysis skills, even though they might not be aware of it,” he says.

From CERN technologies to medical applications



Besides the intrinsic worth of the knowledge that it generates, particle physics often acts as a trailblazer in developing cutting-edge technologies in the fields of accelerators, detectors and computing. These technologies, and the human expertise associated with them, find applications in a variety of areas, including the biomedical field, and can have a societal impact going way beyond their initial scope and expectations.

This webinar will introduce the knowledge-transfer goals of CERN, give an overview of the Laboratory’s medical-applications-related activities and give examples of the impact of CERN technologies on medtech: from hadrontherapy to medical imaging, flash radiotherapy, computing and simulation tools. It will also touch upon the challenges of transferring the technologies and know-how from CERN to the medtech industry and medical research.


Dr Manuela Cirilli is the deputy group leader of CERN’s Knowledge Transfer (KT) group, whose mission is to maximise the impact of CERN on society by creating opportunities for the transfer of the Laboratory’s technologies and know-how to fields outside particle physics. Manuela leads the Medical Applications section of the KT group and chairs the CERN Medical Applications Project Forum. She has an academic background in particle physics and science communication. In 1997, she started working on the NA48 experiment at CERN, designed to measure CP violation in the kaon system. In 2001, she began working on the construction, commissioning and calibration of the precision muon chambers of the ATLAS experiment at the LHC, until she joined CERN’s KT group in 2010.

In parallel to her career, Manuela has been actively engaging in science communication and popularisation since the early 2000s.


