
Omnidirectional camera pose estimation and projective texture mapping for photorealistic 3D virtual reality experiences / Luchetti, Alessandro; Zanetti, Matteo; Kalkofen, Denis; De Cecco, Mariolino. - In: ACTA IMEKO. - ISSN 2221-870X. - 11:2(2022), pp. 1-8. [10.21014/acta_imeko.v11i2.1127]

Omnidirectional camera pose estimation and projective texture mapping for photorealistic 3D virtual reality experiences

Luchetti, Alessandro (First);
Zanetti, Matteo (Second);
De Cecco, Mariolino (Last)
2022-01-01

Abstract

Modern virtual reality applications require a high level of immersion, letting users experience the environment as if it were real. Applications that deal with real scenarios must acquire both the three-dimensional (3D) structure of the environment and its visual details to deliver a convincing immersive experience. The purpose of this paper is to illustrate a method to obtain a mesh with a high-quality texture by combining a raw 3D mesh model of the environment with 360° images. The main outcome is a mesh with a high level of photorealistic detail. This enables both good depth perception, thanks to the mesh model, and high visualization quality, thanks to the 2D resolution of modern omnidirectional cameras. The fundamental step towards this goal is the correct alignment between the 360° camera and the 3D mesh model. For this reason, we propose a method that comprises two steps: 1) find the 360° camera's pose within the 3D environment; 2) project the high-quality 360° image on top of the mesh. After describing the method, we outline its validation in two virtual reality scenarios, a mine and a city environment, which allows us to compare the achieved results with the ground truth.
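The second step the abstract describes, projecting a 360° image onto the mesh, can be illustrated with a minimal sketch. Assuming the omnidirectional image uses the common equirectangular format and that the camera pose is given as a world-to-camera rotation `R` and translation `t` (the function name `equirect_uv` and these parameter names are illustrative, not taken from the paper), each mesh vertex can be assigned a texture coordinate by converting its viewing direction from the camera into longitude/latitude angles:

```python
import numpy as np

def equirect_uv(vertices, R, t):
    """Project 3D mesh vertices into an equirectangular 360-degree image.

    vertices: (N, 3) world-space points
    R, t: camera pose as a 3x3 rotation and (3,) translation (world -> camera)
    Returns (N, 2) texture coordinates in [0, 1].
    """
    # Transform vertices into the camera frame
    p = (R @ vertices.T).T + t
    # Unit direction vectors from the camera centre to each vertex
    d = p / np.linalg.norm(p, axis=1, keepdims=True)
    # Longitude (azimuth) and latitude (elevation) of each direction
    lon = np.arctan2(d[:, 0], d[:, 2])            # range [-pi, pi]
    lat = np.arcsin(np.clip(d[:, 1], -1.0, 1.0))  # range [-pi/2, pi/2]
    # Map the angles onto [0, 1] equirectangular texture coordinates
    u = lon / (2 * np.pi) + 0.5
    v = lat / np.pi + 0.5
    return np.stack([u, v], axis=1)
```

With an identity pose, a vertex straight ahead of the camera (along +Z here) maps to the centre of the image, (u, v) = (0.5, 0.5). A full projective-texture pipeline would additionally resolve occlusions (vertices hidden from the camera must not receive its texture), which this sketch omits.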
Year: 2022
Issue: 2
Luchetti, Alessandro; Zanetti, Matteo; Kalkofen, Denis; De Cecco, Mariolino
Files in this record:

File: 1127-Article Text-8809-1-10-20220712.pdf
Access: open access
Type: Publisher's version (publisher's layout)
License: Creative Commons
Size: 1.38 MB
Format: Adobe PDF

Documents in IRIS are protected by copyright, and all rights are reserved unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11572/392893
Citations
  • PMC: not available
  • Scopus: 1
  • Web of Science: not available
  • OpenAlex: not available