Unsupervised Learning of Terrain Representations for Haptic Monte Carlo Localization

Camurri, M.
2022-01-01

Abstract

Haptic sensing has recently been used effectively for legged robot localization in extreme scenarios where cameras and LiDAR might fail, such as dusty mines and foggy sewers. However, existing haptic sensing mainly relies on supervised classification, with training and evaluation executed over explicit terrain classes. Defining classes is a significant limitation in real-world applications, where prior labelling and handcrafted classes are often impractical. This paper proposes a novel haptic localization system based on a fully unsupervised terrain representation learned solely from the force/torque sensors located in the quadruped robot’s feet. Instead of using the detected terrain class for localization, we propose an improved autoencoder architecture to generate a sparse map of encodings on the first run and to localize against this sparse map during subsequent runs. We compare our approach to a haptic localization system based on supervised terrain classification, showing that the unsupervised method has comparable or better performance than the supervised one on the same trajectories, while clearly outperforming the proprioceptive odometry estimator available on the robot. The proposed approach is therefore well-suited for routine maintenance applications and increases the platform’s robustness.
Year: 2022
Conference: 2022 IEEE International Conference on Robotics and Automation (ICRA)
Place of publication: Piscataway, NJ
Publisher: IEEE Institute of Electrical and Electronics Engineers Inc.
ISBN: 9781728196817
Authors: Łysakowski, M; Nowicki, M R; Buchanan, R; Camurri, M; Fallon, M; Walas, K
Unsupervised Learning of Terrain Representations for Haptic Monte Carlo Localization / Łysakowski, M; Nowicki, M R; Buchanan, R; Camurri, M; Fallon, M; Walas, K. - (2022), pp. 4642-4648. ( 2022 IEEE International Conference on Robotics and Automation (ICRA) Philadelphia, PA, USA 23-27 May 2022) [10.1109/ICRA46639.2022.9812296].
Files in this product:
File: 22_lysakowski2022icra.pdf
Access: Archive administrators only
Description: IEEE Xplore logo - conference paper
Type: Publisher's version (publisher's layout)
License: All rights reserved
Size: 5.03 MB
Format: Adobe PDF

Documents in IRIS are protected by copyright, and all rights are reserved unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11572/433291
Citations
  • PMC: n/a
  • Scopus: 4
  • Web of Science (ISI): 4
  • OpenAlex: n/a