Aung, MSH; Kaltwang, S; Romera-Paredes, B; Martinez, B; Singh, A; Cella, M; Valstar, M; ... Bianchi-Berthouze, N
(2015) The automatic detection of chronic pain-related expression: requirements, challenges and a multimodal dataset. IEEE Transactions on Affective Computing, 7 (4), pp. 435-451. 10.1109/TAFFC.2015.2462830.
Abstract
Pain-related emotions are a major barrier to effective self-rehabilitation in chronic pain. Automated coaching systems capable of detecting these emotions are a potential solution. This paper lays the foundation for the development of such systems by making three contributions. First, through literature reviews, an overview of how chronic pain is expressed and the motivation for detecting it in physical rehabilitation is provided. Second, a fully labelled multimodal dataset is supplied, containing high-resolution multiple-view face videos, head-mounted and room audio signals, full-body 3-D motion capture and electromyographic signals from back muscles. Natural, unconstrained pain-related facial expressions and body movement behaviours were elicited from people with chronic pain carrying out physical exercises. Both instructed and non-instructed exercises were considered to reflect different rehabilitation scenarios. Two sets of labels were assigned: level of pain from facial expressions, annotated by eight raters, and the occurrence of six pain-related body behaviours, segmented by four experts. Third, through exploratory experiments grounded in the data, the factors and challenges in the automated recognition of such expressions and behaviours are described. The paper concludes by discussing potential avenues in the context of these findings, highlighting differences between the two exercise scenarios addressed.