Learning Haptic Exploration Schemes for Adaptive Task Execution / Eiband, T.; Saveriano, M.; Lee, D. - (2019), pp. 7048-7054. (Paper presented at the 2019 International Conference on Robotics and Automation, ICRA 2019, held at the Palais des Congrès de Montréal, Canada, 2019) [10.1109/ICRA.2019.8793934].
Learning Haptic Exploration Schemes for Adaptive Task Execution
Saveriano M.;
2019-01-01
Abstract
The recent generation of compliant robots enables kinesthetic teaching of novel skills from human demonstration. This allows tasks to be transferred to the robot more intuitively than through conventional programming interfaces. Physical interactions can be programmed by manually guiding the robot, which learns the behavior from the recorded motion and force data. To let the robot react to changes in the environment, force sensing can be used to identify constraints and act accordingly. Since autonomous exploration strategies over the whole workspace are time-consuming, we propose to learn these exploration schemes from human demonstrations in an object-targeted manner. The presented teaching strategy and learning framework generate adaptive robot behaviors that rely on the robot's sense of touch in a systematically changing environment. A generated behavior consists of a hierarchical representation of skills: haptic exploration skills, which touch the environment with the end effector, and relative manipulation skills, which are parameterized according to previous exploration events. The effectiveness of the approach is demonstrated in a manipulation task in which the adaptive task structure generalizes to unseen object locations; the robot autonomously manipulates objects without relying on visual feedback.
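The hierarchical skill structure described in the abstract can be illustrated with a minimal sketch: an exploration skill touches the environment and records a contact event, and a subsequent manipulation skill is parameterized relative to that event. All names here (`ExplorationEvent`, `haptic_explore`, `relative_manipulation`) are hypothetical and chosen for illustration; they are not the authors' implementation, and the guarded motion is simulated by a plain callable.

```python
from dataclasses import dataclass

@dataclass
class ExplorationEvent:
    """Contact pose detected when the end effector touches the environment.

    Hypothetical data structure; the paper's actual representation may differ.
    """
    contact_position: tuple  # (x, y, z) in the robot base frame

def haptic_explore(guarded_move):
    """Run a guarded move until contact and record the resulting event.

    `guarded_move` stands in for a compliant, force-monitored motion
    primitive; here it simply returns a simulated contact position.
    """
    return ExplorationEvent(contact_position=guarded_move())

def relative_manipulation(event, offset):
    """Compute a manipulation target relative to the last exploration event,
    so the behavior adapts to the actual object location."""
    x, y, z = event.contact_position
    dx, dy, dz = offset
    return (x + dx, y + dy, z + dz)

# Hierarchical behavior: explore first, then manipulate relative to contact.
event = haptic_explore(lambda: (0.5, 0.125, 0.0))    # simulated contact
target = relative_manipulation(event, (0.0, 0.0, 0.25))
print(target)  # → (0.5, 0.125, 0.25)
```

Because the manipulation target is defined as an offset from the sensed contact, the same skill sequence adapts when the object moves, which is the core idea of parameterizing manipulation skills by previous exploration events.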
File | Description | License | Size | Format
---|---|---|---|---
Learning_Haptic_Exploration_Schemes_for_Adaptive_Task_Execution.pdf (access restricted to archive managers) | Publisher's version (publisher's layout) | All rights reserved | 5.89 MB | Adobe PDF
Documents in IRIS are protected by copyright, and all rights are reserved unless otherwise indicated.