
MR Object Identification and Interaction: Fusing Object Situation Information from Heterogeneous Sources / Strecker, Jannis; Akhunov, Khakim; Carbone, Federico; García, Kimberly; Bektaş, Kenan; Gomez, Andres; Mayer, Simon; Yildirim, Kasim Sinan. - In: PROCEEDINGS OF THE ACM ON INTERACTIVE, MOBILE, WEARABLE AND UBIQUITOUS TECHNOLOGIES. - ISSN 2474-9567. - 7:3(2023), pp. -26. (Paper presented at the UbiComp conference held in Mexico, 8th-12th October) [10.1145/3610879].

MR Object Identification and Interaction: Fusing Object Situation Information from Heterogeneous Sources

Khakim Akhunov; Kasim Sinan Yildirim
2023-01-01

Abstract

The increasing number of objects in ubiquitous computing environments creates a need for effective object detection and identification mechanisms that permit users to intuitively initiate interactions with these objects. While multiple approaches to such object detection - including through visual object detection, fiducial markers, relative localization, or absolute spatial referencing - are available, each of these suffers from drawbacks that limit their applicability. In this paper, we propose ODIF, an architecture that permits the fusion of object situation information from such heterogeneous sources and that remains vertically and horizontally modular to allow extending and upgrading systems that are constructed accordingly. We furthermore present BLEARVIS, a prototype system that builds on the proposed architecture and integrates computer-vision (CV) based object detection with radio-frequency (RF) angle of arrival (AoA) estimation to identify BLE-tagged objects. In our system, the front camera of a Mixed Reality (MR) head-mounted display (HMD) provides a live image stream to a vision-based object detection module, while an antenna array that is mounted on the HMD collects AoA information from ambient devices. In this way, BLEARVIS is able to differentiate between visually identical objects in the same environment and can provide an MR overlay of information (data and controls) that relates to them. We include experimental evaluations of both the CV-based object detection and the RF-based AoA estimation, and discuss the applicability of the combined RF and CV pipelines in different ubiquitous computing scenarios. This research can form a starting point to spawn the integration of diverse object detection, identification, and interaction approaches that function across the electromagnetic spectrum, and beyond.
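
The fusion idea described in the abstract, assigning each RF AoA reading to the CV detection whose viewing direction it best agrees with, can be illustrated with a small sketch. This is not the authors' published code: the pinhole-camera mapping, the field-of-view value, the angular matching threshold, and all names below (Detection, AoAReading, pixel_to_azimuth, fuse) are assumptions made for exposition only; BLEARVIS's actual fusion logic may differ.

# Illustrative sketch only (not from the paper): fuses CV detections with BLE AoA
# readings by matching image-plane viewing angles to angles of arrival.
# The camera model, field of view, threshold, and all names are assumed values.
import math
from dataclasses import dataclass

@dataclass
class Detection:            # one CV detection in the HMD camera image
    label: str
    cx: float               # bounding-box centre, pixels
    cy: float

@dataclass
class AoAReading:           # one RF reading from a BLE tag
    tag_id: str
    azimuth_deg: float      # horizontal angle of arrival at the antenna array

def pixel_to_azimuth(cx, image_width, hfov_deg):
    # Map a pixel column to a horizontal viewing angle, assuming a pinhole camera
    # whose optical axis is aligned with the antenna array's boresight.
    f = (image_width / 2) / math.tan(math.radians(hfov_deg / 2))
    return math.degrees(math.atan2(cx - image_width / 2, f))

def fuse(detections, aoa_readings, image_width=1280, hfov_deg=65.0, max_err_deg=15.0):
    # Assign each BLE tag to the detection whose viewing angle best matches its AoA.
    fused = {}
    for reading in aoa_readings:
        best, best_err = None, max_err_deg
        for det in detections:
            err = abs(pixel_to_azimuth(det.cx, image_width, hfov_deg) - reading.azimuth_deg)
            if err < best_err:
                best, best_err = det, err
        if best is not None:
            fused[reading.tag_id] = best    # tag identity can now anchor the MR overlay
    return fused

# Two visually identical lamps; the AoA readings disambiguate which is which.
dets = [Detection("lamp", 300, 360), Detection("lamp", 980, 360)]
aoas = [AoAReading("lamp-A", -18.0), AoAReading("lamp-B", 17.0)]
print(fuse(dets, aoas))

In this toy example, the two lamps produce identical CV detections and are told apart only because their BLE tags report different angles of arrival, which is the disambiguation capability the abstract claims for BLEARVIS.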
Year: 2023
Publisher: Association for Computing Machinery (ACM), 1601 Broadway, 10th Floor, New York, NY, USA
Authors: Strecker, Jannis; Akhunov, Khakim; Carbone, Federico; García, Kimberly; Bektaş, Kenan; Gomez, Andres; Mayer, Simon; Yildirim, Kasim Sinan
Files associated with this product:
There are no files associated with this product.

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11572/399572

Warning! The displayed data have not been validated by the university.

Citations
  • PMC: N/A
  • Scopus: 1
  • ISI (Web of Science): 0