Spatial-Temporal Calibration for Outdoor Location-Based Augmented Reality / Orlandi, L. O.; Depedri, K.; Conci, N. - In: IEEE SENSORS JOURNAL. - ISSN 1530-437X. - 24:11 (2024), pp. 18382-18391. [10.1109/JSEN.2024.3388002]
Spatial-Temporal Calibration for Outdoor Location-Based Augmented Reality
Orlandi L. O.; Conci N.
2024-01-01
Abstract
The 3-D digitization of content and its visualization through augmented reality (AR) have gained significant interest within the scientific community. Researchers from various fields have acknowledged the potential of these technologies and are actively exploring ways to give users easy access to digitized information by seamlessly integrating content directly into their field of view. One of the most promising approaches in outdoor scenarios is so-called location-based AR, where content is displayed to the user by combining satellite positioning [global navigation satellite system (GNSS)] and inertial [inertial measurement unit (IMU)] sensors. Although the application fields are numerous, the limited accuracy with which content is superimposed on the virtual view still hinders the widespread adoption of such technologies. In this article, we propose combining a GNSS device equipped with real-time kinematic (RTK) positioning with a regular smartphone. To this end, we implement a novel offline calibration process that leverages a motion capture (MoCap) system. The proposed solution ensures temporal consistency and allows for real-time acquisition at centimeter-level accuracy.
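The abstract does not detail the calibration procedure itself. Purely as an illustrative sketch of what spatial-temporal alignment between two position streams (e.g., an RTK GNSS track and a smartphone or MoCap track) can involve, the Python fragment below estimates a time offset by cross-correlating speed profiles and then computes a rigid spatial alignment with the Kabsch/Umeyama method. This is not the authors' pipeline; all function names and parameters are hypothetical assumptions for the example.

```python
# Illustrative sketch only: NOT the calibration method from the paper.
# Generic spatial-temporal alignment of two 3-D position streams,
# e.g. RTK GNSS (stream A) vs. smartphone/MoCap trajectory (stream B).
import numpy as np


def estimate_time_offset(t_a, p_a, t_b, p_b, dt=0.01, max_lag_s=2.0):
    """Estimate the time shift between two 3-D trajectories by
    cross-correlating their speed profiles on a common time grid."""
    t0, t1 = max(t_a[0], t_b[0]), min(t_a[-1], t_b[-1])
    grid = np.arange(t0, t1, dt)

    def speed(t, p):
        # Resample each coordinate, then differentiate to get speed.
        rs = np.column_stack([np.interp(grid, t, p[:, k]) for k in range(3)])
        return np.linalg.norm(np.gradient(rs, dt, axis=0), axis=1)

    s_a, s_b = speed(t_a, p_a), speed(t_b, p_b)
    s_a, s_b = s_a - s_a.mean(), s_b - s_b.mean()

    max_lag = int(max_lag_s / dt)
    lags = np.arange(-max_lag, max_lag + 1)
    corr = [np.dot(s_a[max(0, -l):len(s_a) - max(0, l)],
                   s_b[max(0, l):len(s_b) - max(0, -l)]) for l in lags]
    # Positive result: stream B lags stream A by this many seconds.
    return lags[int(np.argmax(corr))] * dt


def rigid_alignment(src, dst):
    """Least-squares rigid transform (Kabsch/Umeyama, no scale)
    mapping src points onto dst points: dst ≈ R @ src + t."""
    mu_s, mu_d = src.mean(axis=0), dst.mean(axis=0)
    H = (src - mu_s).T @ (dst - mu_d)
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ D @ U.T
    t = mu_d - R @ mu_s
    return R, t
```

In such a setup, the estimated offset would be applied to one stream's timestamps before resampling time-matched position pairs for the rigid alignment; how the actual system handles sensor latencies and the MoCap reference is described in the full article.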