Confidently discovering new physics in the muon g−2 anomaly requires that data-driven and lattice-QCD calculations of the Standard-Model value agree, write Thomas Blum, Luchang Jin and Christoph Leh...
The TOOLS 2020 conference attracted around 200 phenomenologists and experimental physicists to work on numerical tools for dark-matter models and beyond.
The new collaboration will work to realise the full potential of the coming generation of high-performance computing technology for data-intensive science.
Mait Müntel left physics to found Lingvist, an education company harnessing big data and artificial intelligence to accelerate language learning.
The computing demands expected this decade put HEP in a position similar to that of 1995, when the field moved to PCs, argues Sverre Jarp.
CERN’s new quantum technology initiative has the potential to enrich and expand its challenging research programme, says Alberto Di Meglio.
CERN’s Graeme Stewart tours six decades of computing milestones in high-energy physics and describes the immense challenges ahead in taming data from future experiments.
FPGAs can now be programmed in C++ and Java, bringing machine learning and complex algorithms within the scope of trigger-level analysis.
Fermilab has announced the launch of HEPCloud, a step towards a new computing paradigm to deal with the vast quantities of data pouring in from existing and future facilities.
The High-Performance Computing for Lebanon project is part of efforts by Lebanese scientists to boost the nation’s research capabilities.