Rivas, J; Lara, PC; Castrejon, L; Franco, JH; Orihuela-Espina, F; Palafox, L; Williams, ACDC; ...; Sucar, LE (2021)
Multi-label and multimodal classifier for affective states recognition in virtual rehabilitation.
IEEE Transactions on Affective Computing
(In press).
Text: MLR_AS_ver05.pdf - Accepted Version (15MB)
Abstract
Computational systems that process multiple affective states may benefit from explicitly considering the interactions between those states to enhance recognition performance. This work proposes combining a multi-label classifier, the Circular Classifier Chain (CCC), with a multimodal classifier, Fusion using a Semi-Naive Bayesian classifier (FSNBC), to explicitly include the dependencies between multiple affective states in the automatic recognition process. The combined classifier is applied in the context of virtual rehabilitation for post-stroke patients. We collected data from post-stroke patients, including finger pressure, hand movements, and facial expressions, over ten longitudinal sessions. Videos of the sessions were labelled by clinicians for four states: tiredness, anxiety, pain, and engagement. Each state was modelled by an FSNBC that received the finger pressure, hand movement, and facial expression information. The four FSNBCs were linked in the CCC to exploit the dependency relationships between the states. The CCC converged within at most five iterations for every patient. Results (ROC AUC) of the CCC with the FSNBC exceed 0.940 ± 0.045 (mean ± standard deviation) for all four states. Mutual-exclusion relationships between engagement and every other state, and co-occurrences between pain and anxiety, were detected and discussed.
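To illustrate the chaining idea described in the abstract, the following is a minimal sketch of a circular classifier chain over the four affective states. It is not the authors' implementation: scikit-learn's GaussianNB stands in for the paper's multimodal FSNBC, the feature matrix X is assumed to already contain the fused pressure, movement, and facial-expression descriptors, and the names train_ccc and predict_ccc are hypothetical.

```python
# Minimal circular classifier chain (CCC) sketch over four affective states.
# Assumptions: X is a pre-fused feature matrix; GaussianNB is a placeholder
# for the per-state multimodal classifier (FSNBC) used in the paper.
import numpy as np
from sklearn.naive_bayes import GaussianNB

STATES = ["tiredness", "anxiety", "pain", "engagement"]

def train_ccc(X, Y):
    """Train one classifier per state; each sees X plus the other states' labels."""
    models = []
    for i in range(Y.shape[1]):
        others = np.delete(Y, i, axis=1)          # ground-truth labels of the other states
        models.append(GaussianNB().fit(np.hstack([X, others]), Y[:, i]))
    return models

def predict_ccc(models, X, max_iters=5):
    """Cycle around the chain until the joint label assignment stops changing."""
    Y_hat = np.zeros((X.shape[0], len(models)), dtype=int)   # initial guess: all zeros
    for _ in range(max_iters):
        prev = Y_hat.copy()
        for i, model in enumerate(models):
            others = np.delete(Y_hat, i, axis=1)  # current estimates of the other states
            Y_hat[:, i] = model.predict(np.hstack([X, others]))
        if np.array_equal(Y_hat, prev):           # convergence: no label changed
            break
    return Y_hat

# Toy usage with random data standing in for real session features and labels.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 10))
Y = (rng.random((200, 4)) > 0.5).astype(int)
models = train_ccc(X, Y)
print(predict_ccc(models, X[:5]))
```

The circular aspect is the repeated pass over all four per-state classifiers, each conditioned on the current estimates of the other states; the abstract reports that this iteration converged within at most five passes for all patients, which motivates the small max_iters default above.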