The miniaturisation of sensors and processors, the advances in connected edge intelligence, and the rapidly growing interest in Artificial Intelligence are driving the adoption of autonomous nano-sized drones in the Internet of Robotic Things ecosystem. However, achieving safe autonomous navigation and high-level tasks such as exploration and surveillance with these tiny platforms is extremely challenging due to their limited resources. This work focuses on enabling the safe and autonomous flight of a pocket-sized, 30-gram platform, the Crazyflie 2.1, in a partially known environment. We propose a novel AI-aided, vision-based reactive planning method for obstacle avoidance under the ambit of the Integrated Sensing, Computing and Communication (ISCC) paradigm. We deal with the constraints of the nano-drone by splitting the navigation task into two parts: a deep-learning-based object detector runs on the edge (external hardware), while the planning algorithm is executed onboard. The results show the ability to command the drone at ∼8 frames per second and a model performance reaching a COCO mean average precision (mAP) of 60.8. Field experiments demonstrate the feasibility of the solution, with the drone flying at a top speed of 1 m/s while steering away from an obstacle placed in an unknown position and reaching the target destination. The outcome highlights the compatibility of the communication delay and the model performance with the requirements of the real-time navigation task. We provide a feasible alternative to a fully onboard implementation that can be extended to autonomous exploration with nano-drones.
AI and Vision Based Autonomous Navigation of Nano-Drones in Partially-Known Environments / Sartori, Mattia; Singhal, Chetna; Roy, Neelabhro; Brunelli, Davide; Gross, James. - (2025), pp. 307-314. (Paper presented at the DCOSS-IoT 2025 conference held in Lucca, Italy, June 9-11, 2025) [10.1109/dcoss-iot65416.2025.00058].
AI and Vision Based Autonomous Navigation of Nano-Drones in Partially-Known Environments
| File | Access | Type | Licence | Size | Format |
|---|---|---|---|---|---|
| AI_and_Vision_Based_Autonomous_Navigation_of_Nano-Drones_in_Partially-Known_Environments.pdf | Open access | Editorial version (publisher's layout) | All rights reserved | 3.32 MB | Adobe PDF |
Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.