A Visuo-Haptic Guidance Interface for Mobile Collaborative Robotic Assistant (MOCA) / Lamon, Edoardo; Fusaro, Fabio; Balatti, Pietro; Kim, Wansoo; Ajoudani, Arash. - (2020), pp. 11253-11260. (Paper presented at the 2020 IEEE/RSJ International Conference on Intelligent Robots and Systems, IROS 2020, held in Las Vegas, 25th-30th October 2020) [10.1109/IROS45743.2020.9341357].
A Visuo-Haptic Guidance Interface for Mobile Collaborative Robotic Assistant (MOCA)
Lamon, Edoardo (first author)
2020-01-01
Abstract
In this work, we propose a novel visuo-haptic guidance interface that enables mobile collaborative robots to follow human instructions in a way that is understandable to non-experts. The interface is composed of a haptic admittance module and a human visual tracking module. The haptic guidance enables an individual to guide the robot end-effector in the workspace to reach and grasp arbitrary items. The visual interface, on the other hand, uses a real-time human tracking system and enables autonomous and continuous navigation of the mobile robot towards the human, with the ability to avoid static and dynamic obstacles along its path. To ensure safer human-robot interaction, the visual tracking goal is set outside a safety area around the human body; entering this area switches the robot's behaviour to the haptic mode. The two modes are executed by two different controllers: the mobile-base admittance controller for the haptic guidance, and the robot's whole-body impedance controller, which enables physically coupled and controllable locomotion and manipulation. The proposed interface is validated experimentally: a human-guided robot performs the loading and transportation of a heavy object in a cluttered workspace, illustrating the potential of the proposed Follow-Me interface to remove the external loading from the human body in this type of repetitive industrial task.
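To make the described behaviour concrete, the sketch below illustrates how a distance threshold around the tracked human could toggle between the visual Follow-Me mode and the haptic guidance mode, and how a first-order admittance law M·v̇ + D·v = F_ext could map the measured interaction wrench to a base velocity command. This is a minimal sketch under assumed parameters, not the authors' implementation: the safety radius, virtual inertia and damping values, and all function names are illustrative assumptions.

```python
# Minimal sketch (not the authors' implementation) of the two-mode behaviour
# described in the abstract: the robot tracks the human visually until the
# human-robot distance falls below a safety radius, at which point it switches
# to haptic admittance guidance. All names, thresholds, and gains are assumed.

import numpy as np

SAFETY_RADIUS = 1.2   # [m] hypothetical radius around the human that triggers haptic mode
DT = 0.01             # [s] control period

# Hypothetical first-order admittance for the mobile base: M * dv/dt + D * v = F_ext
M = np.diag([20.0, 20.0, 5.0])   # virtual mass/inertia (x, y, yaw)
D = np.diag([60.0, 60.0, 15.0])  # virtual damping


def admittance_step(v, f_ext):
    """One integration step of the base admittance law: returns the new velocity."""
    dv = np.linalg.solve(M, f_ext - D @ v)
    return v + dv * DT


def select_mode(robot_xy, human_xy):
    """Switch to haptic guidance when the human is inside the safety radius."""
    dist = np.linalg.norm(np.asarray(human_xy) - np.asarray(robot_xy))
    return "haptic" if dist < SAFETY_RADIUS else "visual_tracking"


# Example: human standing 0.8 m away pushes the end-effector with 30 N along x.
mode = select_mode(robot_xy=(0.0, 0.0), human_xy=(0.8, 0.0))
v = np.zeros(3)
if mode == "haptic":
    v = admittance_step(v, f_ext=np.array([30.0, 0.0, 0.0]))
print(mode, v)
```

In the paper, the haptic mode is realised by a mobile-base admittance controller layered on the robot's whole-body impedance controller; the sketch only captures the mode-switching condition and a single admittance integration step.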
| File | Access | Type | License | Size | Format |
|---|---|---|---|---|---|
| IROS_2020_Visuo_Haptic_Follow_Me_compressed.pdf | Open access | Refereed author's manuscript (post-print) | All rights reserved | 916.76 kB | Adobe PDF |
| A_Visuo-Haptic_Guidance_Interface_for_Mobile_Collaborative_Robotic_Assistant_MOCA.pdf | Archive administrators only | Publisher's layout (editorial version) | All rights reserved | 1.48 MB | Adobe PDF |