The Newer College Dataset: Handheld LiDAR, Inertial and Vision with Ground Truth

Camurri, M
2020-01-01

Abstract

In this paper, we present a large dataset collected with a variety of mobile mapping sensors on a handheld device carried at typical walking speeds for nearly 2.2 km around New College, Oxford, as well as a series of supplementary datasets with much more aggressive motion and lighting contrast. The datasets include data from two commercially available devices: a stereoscopic-inertial camera and a multi-beam 3D LiDAR, which also provides inertial measurements. Additionally, we used a tripod-mounted, survey-grade LiDAR scanner to capture a detailed, millimeter-accurate 3D map of the test location (containing ∼290 million points). Using this map, we generated a 6 Degrees of Freedom (DoF) ground truth pose for each LiDAR scan (with approximately 3 cm accuracy) to enable better benchmarking of LiDAR and vision localisation, mapping and reconstruction systems. This ground truth is the particular novel contribution of this dataset, and we believe it will enable the systematic evaluation which many similar datasets have lacked. The large dataset combines built environments, open spaces and vegetated areas so as to test localisation and mapping systems such as vision-based navigation, visual and LiDAR SLAM, 3D LiDAR reconstruction and appearance-based place recognition, while the supplementary datasets contain very dynamic motions to introduce further challenges for visual-inertial odometry systems.
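
As an illustration of how per-scan 6 DoF ground truth of this kind can be used for benchmarking, the Python sketch below rigidly aligns an estimated trajectory to the ground truth (Kabsch/Umeyama, rotation and translation only) and reports the Absolute Trajectory Error. It is a minimal sketch, not taken from the paper or its released tools; the pose-loading step, array shapes and timestamp matching are assumptions to be adapted to the actual data format.

# Minimal ATE sketch: align an estimated trajectory to ground-truth positions
# (both as N x 3 NumPy arrays of time-synchronised translations) and report RMSE.
import numpy as np

def align_rigid(est: np.ndarray, gt: np.ndarray):
    """Find R, t minimising ||(R @ est_i + t) - gt_i|| over all poses (Kabsch)."""
    mu_e, mu_g = est.mean(axis=0), gt.mean(axis=0)
    H = (est - mu_e).T @ (gt - mu_g)
    U, _, Vt = np.linalg.svd(H)
    # Guard against reflections so R is a proper rotation.
    S = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ S @ U.T
    t = mu_g - R @ mu_e
    return R, t

def ate_rmse(est: np.ndarray, gt: np.ndarray) -> float:
    """Root-mean-square translational error after rigid alignment."""
    R, t = align_rigid(est, gt)
    err = (est @ R.T + t) - gt
    return float(np.sqrt((err ** 2).sum(axis=1).mean()))

if __name__ == "__main__":
    # Synthetic demo only; real use would load matched estimated and
    # ground-truth positions for each LiDAR scan timestamp.
    gt = np.random.rand(200, 3) * 10.0
    est = gt + np.random.normal(scale=0.03, size=gt.shape)  # ~3 cm noise
    print(f"ATE RMSE: {ate_rmse(est, gt):.3f} m")
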
2020
2020 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS)
Piscataway, NJ
IEEE Institute of Electrical and Electronics Engineers Inc.
978-1-7281-6213-3
Ramezani, M; Wang, Y; Camurri, M; Wisth, D; Mattamala, M; Fallon, M
The Newer College Dataset: Handheld LiDAR, Inertial and Vision with Ground Truth / Ramezani, M; Wang, Y; Camurri, M; Wisth, D; Mattamala, M; Fallon, M. - (2020), pp. 4353-4360. ( 2020 IEEE/RSJ International Conference on Intelligent Robots and Systems, IROS 2020 Las Vegas, Virtual 25.10.2020 - 29.10.2020) [10.1109/IROS45743.2020.9340849].
Files in this record:
File: 20_ramezani2020iros.pdf
Access: archive managers only
Type: Publisher's version (Publisher's layout)
Licence: All rights reserved
Size: 5.17 MB
Format: Adobe PDF

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11572/433310
Citations
  • PMC: not available
  • Scopus: 206
  • Web of Science: 162
  • OpenAlex: not available