Hevesi, P., Ward, J. A., Amiraslanov, O., Pirkl, G., and Lukowicz, P. (2018). Wearable Eye Tracking for Multisensor Physical Activity Recognition. International Journal On Advances in Intelligent Systems, 10(1-2), pp. 103-116.
Abstract
This paper explores the use of wearable eye-tracking to detect physical activities and location information during assembly and construction tasks involving small groups of up to four people. Large physical activities, like carrying heavy items and walking, are analysed alongside more precise, hand-tool activities, like using a drill or a screwdriver. In a first analysis, gaze-invariant features from the eye-tracker are classified (using Naive Bayes) alongside features obtained from wrist-worn accelerometers and microphones. An evaluation is presented using data from an 8-person dataset containing over 600 physical activity events, performed under real-world (noisy) conditions. Despite the challenges of working with complex, and sometimes unreliable, data, we show that event-based precision and recall of 0.66 and 0.81 respectively can be achieved by combining all three sensing modalities (using experiment-independent training and temporal smoothing). In a further analysis, we apply state-of-the-art computer vision methods, such as object recognition, scene recognition, and face detection, to generate features from the eye-trackers' egocentric videos. Activity recognition trained on the output of an object recognition model (e.g., VGG16 trained on ImageNet) could predict Precise activities with an (overall average) f-measure of 0.45. Location of participants was similarly obtained using visual scene recognition, with average precision and recall of 0.58 and 0.56.
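To make the second analysis concrete, below is a minimal sketch (not the authors' code) of the pipeline the abstract describes: a pretrained VGG16 model, used as a fixed feature extractor on egocentric video frames, feeding a Naive Bayes classifier. The frame shapes, activity labels, and placeholder training data are illustrative assumptions.

```python
# Hypothetical sketch: VGG16 (ImageNet) features from egocentric frames,
# classified with Naive Bayes as in the paper's first analysis.
import numpy as np
from tensorflow.keras.applications.vgg16 import VGG16, preprocess_input
from sklearn.naive_bayes import GaussianNB

# Pretrained VGG16 as a fixed feature extractor; include_top=False with
# average pooling yields one feature vector per frame.
feature_extractor = VGG16(weights="imagenet", include_top=False, pooling="avg")

def frame_features(frames: np.ndarray) -> np.ndarray:
    """frames: (n, 224, 224, 3) RGB frames from the eye-tracker's scene camera."""
    return feature_extractor.predict(preprocess_input(frames.astype("float32")))

# Placeholder frames and activity labels, purely for illustration
# (real labels would come from the annotated activity events).
X_train = frame_features(np.random.rand(32, 224, 224, 3) * 255)
y_train = np.random.choice(["drill", "screwdriver", "carry", "walk"], size=32)

clf = GaussianNB()
clf.fit(X_train, y_train)
predictions = clf.predict(X_train)  # in practice, predict on held-out frames
```

Per-frame predictions like these would then be smoothed over time, consistent with the temporal smoothing the abstract mentions.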