An Augmented Reality Virtual Assistant to Help Mild Cognitive Impaired Users in Cooking a System Able to Recognize the User Status and Personalize the Support / Dagostini, J.; Bonetti, L.; Salee, A.; Passerini, L.; Fiacco, G.; Lavanda, P.; Motti, E.; Stocco, M.; Gashay, K. T.; Abebe, E. G.; Alemu, S. M.; Haghani, R.; Voltolini, A.; Strobbe, C.; Covre, N.; Santolini, G.; Armellini, M.; Sacchi, T.; Ronchese, D.; Furlan, C.; Facchinato, F.; Maule, L.; Tomasin, P.; Fornaser, A.; De Cecco, M. - (2018), pp. 12-17. (Paper presented at the 2018 Workshop on Metrology for Industry 4.0 and IoT, MetroInd 4.0 and IoT 2018, held in Italy in 2018) [10.1109/METROI4.2018.8428314].
An Augmented Reality Virtual Assistant to Help Mild Cognitive Impaired Users in Cooking a System Able to Recognize the User Status and Personalize the Support
Covre N.;Maule L.;Tomasin P.;Fornaser A.;De Cecco M.
2018-01-01
Abstract
The spectrum of Intelligent Assistive Technologies (IAT) is rapidly expanding and embodies smart systems that support assistive tasks in ambient assisted living. In this field, a virtual assistant was developed to help subjects with mild cognitive impairment carry out elementary food preparation. The system is composed of several modules that manage supportive animations and user action recognition, yielding a personalized assistant able to interact with the user according to the user's status and the environmental context. The main results are: action recognition with 85% accuracy among the reach, move, mix, tilt, and grasp gestures; the development of a sensor fusion module able to estimate object accuracies in real time, taking the quality of fit into account; and the development of contextualized animations that take object locations on the kitchen plane into account.