AI/Machine Learning/Deep Learning
Clouds and Distributed Computing
Time: Tuesday, June 26th, 3:15pm - 3:45pm
Description: The DEEP Hybrid DataCloud project aims to provide a bridge towards more flexible exploitation of intensive computing resources by research communities, enabling access to the latest technologies and specialised HPC hardware, such as GPUs and low-latency interconnects, needed to explore large datasets. We will integrate the intensive computing services under a Hybrid Cloud approach, ensuring interoperability with unmodified HPC resources as well as the upcoming EOSC platform. This will simplify access to resources that researchers may otherwise not be able to reach at scale.
A number of pilot applications exploiting very large datasets in fields such as Biology, Earth Observation, Network Security, and Physics are proposed. They will be integrated in a testbed with significant HPC resources, including latest-generation GPUs, to evaluate the performance and scalability of the solutions. A DevOps approach will be implemented to provide a pipeline ensuring the quality of the released software and services; this pipeline will also be offered to the developers of research applications.
We aim to release a “DEEP as a Service” solution offering an easy integration path for new communities and users to develop applications requiring deep learning techniques, parallel post-processing of very large datasets, and analysis of massive online data streams.
The project has received funding from the European Union’s Horizon 2020 research and innovation programme under grant agreement No. 777435. It started on November 1, 2017, with a runtime of 30 months.