Connecting the dots with neural networks

8 November 2022

Deep Learning for Physics Research by Martin Erdmann, Jonas Glombitza, Gregor Kasieczka and Uwe Klemradt, World Scientific

Going deep

The use of deep learning in particle physics has exploded in recent years. Based on INSPIRE HEP’s database, the number of papers in high-energy physics and related fields referring to deep learning and similar topics has grown 10-fold over the last decade. A textbook introducing these concepts to physics students is therefore timely and valuable.

When teaching deep learning to physicists, it can be difficult to strike a balance between theory and practice, physics and programming, and foundations and state-of-the-art. Born out of a lecture series at RWTH Aachen and Hamburg universities, Deep Learning for Physics Research by Martin Erdmann, Jonas Glombitza, Gregor Kasieczka and Uwe Klemradt does an admirable job of striking this balance.

The book contains 21 chapters split across four parts: deep-learning basics, standard deep neural networks, interpretability and uncertainty quantification, and advanced concepts.

In part one, the authors cover introductory topics including physics data, neural-network building blocks, training and model building. Part two surveys and applies different neural-network structures, including fully connected, convolutional, recurrent and graph neural networks, while also reviewing multi-task learning. Part three covers introspection, interpretability, uncertainty quantification, and revisits different objective functions for a variety of learning tasks. Finally, part four touches on weakly supervised and unsupervised learning methods, generative models, domain adaptation and anomaly detection. Helping to lower the barrier to entry for physics students to use deep learning in their work, the authors contextualise these methods in real physics-research studies, which is an added benefit compared to similar textbooks.

Deep learning borrows many concepts from physics, which can provide a way of connecting similar ideas in the two fields. A nice example explained in the book is the cross-entropy loss function, which has its origins in the definition of entropy according to Gibbs and Boltzmann. Another example that crops up, although rather late in part three, is the connection between the mean-squared-error loss function and the log-likelihood function for a Gaussian probability distribution, which may be more familiar to physics students accustomed to performing maximum likelihood fits.
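This second connection can be made explicit in a few lines. Assuming targets $y_i$ scattered about a model prediction $f_\theta(x_i)$ with a fixed Gaussian width $\sigma$ (the notation here is illustrative, not the book's), the negative log-likelihood of $N$ data points is

\[
-\ln \mathcal{L}(\theta) \;=\; \sum_{i=1}^{N} \frac{\bigl(y_i - f_\theta(x_i)\bigr)^2}{2\sigma^2} \;+\; N \ln\!\bigl(\sigma\sqrt{2\pi}\bigr),
\]

and since the second term does not depend on the model parameters $\theta$, minimising the mean-squared error is equivalent to maximising the Gaussian likelihood.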


Accompanying the textbook is a breadth of free, online Jupyter notebooks (executable Python code in an interactive format). These curated notebooks are paired with different chapters and immerse students in hands-on exercises. Both the problem and corresponding solution notebooks are available online, and are accessible to students even without expensive computing hardware as they can be launched on free cloud services such as Google Colab or Binder. In addition, students who have a CERN account can launch the notebooks on CERN's service for web-based analysis (SWAN) platform.

Advanced exercises include the training and evaluation of a denoising autoencoder for speckle removal in X-ray images and a Wasserstein generative adversarial network for the generation of cosmic-ray-induced air-shower footprints. What is truly exciting about these exercises is their use of physics research examples, many taken from recent publications. Students can see how close their homework exercises and solutions are to cutting-edge research, which can be highly motivating.

In a book spanning less than 300 pages (excluding references), it is impossible to cover everything, especially as new deep-learning methods are developed almost daily. For a more theoretical understanding of the fundamentals of deep learning, readers are advised to consult the classic Deep Learning by Ian Goodfellow, Yoshua Bengio and Aaron Courville, while for more recent deep-learning developments in particle physics they are directed to the article “A Living Review of Machine Learning for Particle Physics” by Matthew Feickert and Benjamin Nachman.

With continued interest in deep learning, coverage of a variety of real physics-research examples and a breadth of accessible, online exercises, Deep Learning for Physics Research is poised to be a standard textbook on the bookshelf of physics students for years to come.
