Cloud services take off in the US and Europe

2 September 2019
Paradigm shift: researchers on the NOvA and CMS experiments have already used HEPCloud to run jobs at the US National Energy Research Scientific Computing Center at Berkeley. Credit: R Kaltschmidt/LBNL

Fermilab has announced the launch of HEPCloud, a step towards a new computing paradigm in particle physics to deal with the vast quantities of data pouring in from existing and future facilities. The aim is to allow researchers to “rent” high-performance computing centres and commercial clouds at times of peak demand, thus reducing the costs of providing computing capacity. Similar projects are also gaining pace in Europe.

“Traditionally, we would buy enough computers for peak capacity and put them in our local data centre to cover our needs,” says Fermilab’s Panagiotis Spentzouris, one of HEPCloud’s drivers. “However, the needs of experiments are not steady. They have peaks and valleys, so you want an elastic facility.” All Fermilab experiments will soon submit jobs to HEPCloud, which provides a uniform interface so that researchers don’t need expert knowledge about where and how best to run their jobs.
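Fermilab’s batch computing is built on the HTCondor system, which also underpins HEPCloud’s provisioning layer. As a rough illustration of what such a uniform, location-agnostic interface looks like, the sketch below uses HTCondor’s Python bindings to describe a batch of jobs; the script name, input naming, resource requests and job count are invented for the example, and HEPCloud’s actual submission tools may differ.

import htcondor  # HTCondor Python bindings (pip install htcondor)

# Describe the job once; the batch system, not the user, decides where it runs.
job = htcondor.Submit({
    "executable": "run_skim.sh",                      # hypothetical analysis script
    "arguments": "--input nova_run_$(Process).root",  # hypothetical input naming
    "output": "logs/skim_$(Process).out",
    "error": "logs/skim_$(Process).err",
    "log": "logs/skim.log",
    "request_cpus": "1",
    "request_memory": "2GB",
})

schedd = htcondor.Schedd()              # talk to the local scheduler
result = schedd.submit(job, count=100)  # queue 100 identical jobs
print("submitted job cluster", result.cluster())

Behind a facility such as HEPCloud, a provisioning layer then decides whether those 100 jobs land on local worker nodes, rented commercial-cloud instances or an HPC allocation.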

The idea dates back to 2014, when Spentzouris and Fermilab colleague Lothar Bauerdick assessed the volumes of data coming from Fermilab’s neutrino programme and the US participation in CERN’s Large Hadron Collider (LHC) experiments. The first demonstration of HEPCloud on a significant scale came in February 2016, when the CMS experiment used it to run on about 60,000 cores on the Amazon cloud, AWS, and, later that year, on 160,000 cores using Google Cloud Services. Most recently, in May 2018, the NOvA team at Fermilab was able to execute around 2 million hardware threads on a supercomputer at the National Energy Research Scientific Computing Center of the US Department of Energy’s Office of Science. HEPCloud project members now plan to enable experiments to use the state-of-the-art supercomputing facilities run by the DOE’s Advanced Scientific Computing Research programme at Argonne and Oak Ridge national laboratories.

Europe’s Helix Nebula

CERN is leading a similar project in Europe called the Helix Nebula Science Cloud (HNSciCloud). Launched in 2016 and supported by the European Union (EU), it builds on work initiated by EIROforum in 2010 and aims to bridge cloud computing and open science. Working with IT contractors, HNSciCloud members have so far developed three prototype platforms and made them accessible to experts for testing.

“The HNSciCloud pre-commercial procurement finished in December 2018, having shown the integration of commercial cloud services from several providers (including Exoscale and T-Systems) with CERN’s in-house capacity in order to serve the needs of the LHC experiments as well as use cases from life sciences, astronomy, photon and neutron science,” explains project leader Bob Jones of CERN. “The results and lessons learned are contributing to the implementation of the European Open Science Cloud where a common procurement framework is being developed in the context of the new OCRE [Open Clouds for Research Environments] project.”

The European Open Science Cloud, an EU-funded initiative started in 2015, aims to bring efficiencies and make European research data more shareable and reusable. To help European research infrastructures move towards this open-science future, a €16 million EU project called ESCAPE (European Science Cluster of Astronomy & Particle Physics ESFRI research infrastructures) was launched in February. The 3.5-year-long project, led by the CNRS, will see 31 facilities in astronomy and particle physics collaborate on cloud computing and data science, including CERN, the European Southern Observatory, the Cherenkov Telescope Array, KM3NeT and the Square Kilometre Array (SKA).

In the context of ESCAPE, CERN is leading the effort to prototype and implement a FAIR (findable, accessible, interoperable, reusable) data infrastructure based on open-source software, explains Simone Campana of CERN, who is deputy project leader of the Worldwide LHC Computing Grid (WLCG). “This work complements the WLCG R&D activity in the area of data organisation, management and access in preparation for the HL-LHC. In fact, the computing activities of the CERN experiments at HL-LHC and other initiatives such as SKA will be very similar in scale, and will likely coexist on a shared infrastructure.”
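The article does not name specific software, but one open-source building block already used for distributed data management at this scale is Rucio, developed at CERN for the ATLAS experiment. The sketch below is a hypothetical illustration of how a dataset might be registered and replicated across a shared infrastructure; the scope, dataset name and storage expression are invented, and the real ESCAPE prototype may be configured quite differently.

from rucio.client import Client  # Rucio client bindings (pip install rucio-clients)

client = Client()  # assumes a configured Rucio server and valid authentication

# Register a dataset so it becomes findable and accessible through the catalogue.
client.add_dataset(scope="user.jdoe", name="escape.demo.2019")

# Ask Rucio to maintain two replicas on storage elements tagged as Tier-1
# (the "tier=1" expression is a placeholder for a site-specific attribute).
client.add_replication_rule(
    dids=[{"scope": "user.jdoe", "name": "escape.demo.2019"}],
    copies=2,
    rse_expression="tier=1",
)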