From Copernicus Big Data to Extreme Earth Analytics / Koubarakis, M.; Bereta, K.; Bilidas, D.; Giannousis, K.; Ioannidis, T.; Pantazi, D.-A.; Stamoulis, G.; Haridi, S.; Vlassov, V.; Bruzzone, L.; Paris, C.; Eltoft, T.; Kramer, T.; Charalabidis, A.; Karkaletsis, V.; Konstantopoulos, S.; Dowling, J.; Kakantousis, T.; Datcu, M.; Dumitru, C. O.; Appel, F.; Bach, H.; Migdall, S.; Hughes, N.; Arthurs, D.; Fleming, A. - (2019), pp. 690-693. (Paper presented at the 22nd International Conference on Extending Database Technology (EDBT 2019), held in Lisbon, 26-29 March 2019) [10.5441/002/edbt.2019.88].
From Copernicus Big Data to Extreme Earth Analytics
Bruzzone, L.; Paris, C.
2019-01-01
Abstract
Copernicus is the European programme for monitoring the Earth. It consists of a set of systems that collect data from satellites and in-situ sensors, process this data and provide users with reliable and up-to-date information on a range of environmental and security issues. The data and information processed and disseminated put Copernicus at the forefront of the big data paradigm, giving rise to all the relevant challenges, the so-called 5 Vs: volume, velocity, variety, veracity and value. In this short paper, we discuss the challenges of extracting information and knowledge from huge archives of Copernicus data. We propose to achieve this by scale-out distributed deep learning techniques that run on very big clusters offering virtual machines and GPUs. We also discuss the challenges of achieving scalability in the management of the extreme volumes of information and knowledge extracted from Copernicus data. The envisioned scientific and technical work will be carried out in the context of the H2020 project ExtremeEarth, which starts in January 2019.
| File | Access | Type | License | Size | Format |
|---|---|---|---|---|---|
| EDBT19_paper_321.pdf | Open access | Publisher's layout version | Creative Commons | 424.75 kB | Adobe PDF |
Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.