Protocol for the Development of Automatic Multisensory Systems to Analyze Human Activity for Functional Evaluation: Application to the EYEFUL System

dc.contributor.author: Obeso-Benítez, Paula
dc.contributor.author: Pérez-de-Heredia-Torres, Marta
dc.contributor.author: Huertas-Hoyas, Elisabet
dc.contributor.author: Sánchez-Herrera-Baeza, Patricia
dc.contributor.author: Máximo-Bocanegra, Nuria
dc.contributor.author: Serrada-Tejeda, Sergio
dc.contributor.author: Marron-Romera, Marta
dc.contributor.author: Macias-Guarasa, Javier
dc.contributor.author: Losada-Gutierrez, Cristina
dc.contributor.author: Palazuelos-Cagigas, Sira E.
dc.contributor.author: Martin-Sanchez, Jose L.
dc.contributor.author: Martínez-Piédrola, Rosa
dc.date.accessioned: 2024-04-18T12:26:44Z
dc.date.available: 2024-04-18T12:26:44Z
dc.date.issued: 2024-04-18
dc.identifier.citation: Obeso-Benítez, P.; Pérez-de-Heredia-Torres, M.; Huertas-Hoyas, E.; Sánchez-Herrera-Baeza, P.; Máximo-Bocanegra, N.; Serrada-Tejeda, S.; Marron-Romera, M.; Macias-Guarasa, J.; Losada-Gutierrez, C.; Palazuelos-Cagigas, S.E.; et al. Protocol for the Development of Automatic Multisensory Systems to Analyze Human Activity for Functional Evaluation: Application to the EYEFUL System. Appl. Sci. 2024, 14, 3415. https://doi.org/10.3390/app14083415
dc.identifier.issn: 2076-3417
dc.identifier.uri: https://hdl.handle.net/10115/32386
dc.description.abstract: The EYEFUL system represents a pioneering initiative designed to leverage multisensory systems for the automatic evaluation of functional ability and determination of dependency status in people performing activities of daily living. This interdisciplinary effort, bridging the gap between engineering and health sciences, aims to overcome the limitations of current evaluation tools, which often lack objectivity and fail to capture the full range of functional capacity. Until now, such evaluation has been derived from subjective reports and observational methods. By integrating wearable sensors and environmental technologies, EYEFUL offers an innovative approach to quantitatively assessing an individual’s ability to perform activities of daily living, providing a more accurate and unbiased evaluation of functionality and personal independence. This paper describes the protocol planned for the development of the EYEFUL system, from the initial design of the methodology to the deployment of multisensory systems and the subsequent clinical validation process. The implications of this research are far-reaching, offering the potential to improve clinical evaluations of functional ability and ultimately the quality of life of people with varying levels of dependency. With its emphasis on technological innovation and interdisciplinary collaboration, the EYEFUL system sets a new standard for objective evaluation, highlighting the critical role of advanced screening technologies in addressing the challenges of modern healthcare. We expect that the publication of the protocol will help similar initiatives by providing a structured approach and rigorous validation process.
dc.language.iso: eng
dc.publisher: MDPI
dc.rights: Attribution 4.0 International
dc.rights.uri: https://creativecommons.org/licenses/by/4.0/
dc.subject: dependency
dc.subject: activities of daily living
dc.subject: wearable sensors
dc.subject: human activity recognition
dc.subject: multisensory
dc.subject: functional
dc.subject: evaluation systems
dc.title: Protocol for the Development of Automatic Multisensory Systems to Analyze Human Activity for Functional Evaluation: Application to the EYEFUL System
dc.type: info:eu-repo/semantics/article
dc.identifier.doi: 10.3390/app14083415
dc.rights.accessRights: info:eu-repo/semantics/openAccess



Except where otherwise noted, this item's license is described as Attribution 4.0 International.