Multidimensional assessment of daily living activities in a shared Augmented Reality environment / Luchetti, A.; Butaslac, I.; Rosi, M.; Fruet, D.; Nollo, G.; Ianes, P. G.; Pilla, F.; Gasperini, B.; Achille Guandalini, G. M.; Bonavita, J.; Kato, H.; Cecco, M. D. - (2022), pp. 60-65. Paper presented at the MetroLivEnv conference, held in Cosenza (Italy), May 25–27, 2022 [10.1109/MetroLivEnv54405.2022.9826952].
Multidimensional assessment of daily living activities in a shared Augmented Reality environment
Luchetti, A.; Fruet, D.; Nollo, G.
2022-01-01
Abstract
This paper presents an Augmented Reality (AR) framework for measuring, assessing, and training end-user performance in occupational therapy and/or home living environments, taking into account a set of multidimensional performance indicators in an AR environment shared between a therapist and an end-user. The framework can be used for rehabilitation purposes or in an Ambient Assisted Living (AAL) context, where fragile users can be guided and monitored by their caregivers. The system includes a set of distributed sensors whose data are preprocessed and presented in AR to the therapist/caregiver, who can interact in the same real and virtual environments during patient/end-user assessment and support. The framework was designed in collaboration with the clinical experts of the Villa Rosa rehabilitation hospital in Pergine (Italy) to meet all the relevant requirements from the point of view of the therapist (such as interface options and evaluation parameters) and of a fragile end-user (such as the characteristics and positions of objects with respect to the field of view). We developed a specific activity of daily living (ADL): setting a table in a kitchen. Both the therapist and the user wear a Microsoft HoloLens 2. The therapist is asked to set the table with virtual objects. The user then has to set the table with real objects, trying to match them as closely as possible to the corresponding virtual models. Once the exercise is completed, thanks to a purpose-built computer vision algorithm for object segmentation, localization, and identification, the therapist can automatically see in AR the error made by the user for each object, along with the elapsed time. The localization algorithm's performance was optimized through a calibration process that achieved a positional accuracy of 5 mm at a 95% confidence level, with residuals between the object rotations estimated by the algorithm and the reference rotations below 1 degree.
In conclusion, the proposed framework is suitable for the metrological assessment of ADLs in living environments.