UCL Discovery

Characterizing urban soundscapes via facial action coding system

Hu, Xuejun; Meng, Qi; Yang, Da; Li, Mengmeng; Kang, Jian; (2025) Characterizing urban soundscapes via facial action coding system. Sustainable Cities and Society, 134, Article 106918. DOI: 10.1016/j.scs.2025.106918.

Whiterose-Characterizing urban soundscapes via facial action coding system.pdf - Accepted Version (Text, 2MB)
Access restricted to UCL open access staff until 20 October 2026.

Abstract

The Facial Action Coding System (FACS) classifies facial expressions into separate Action Units (AUs), which can reflect perception of the sound environment more accurately and in real time than whole-expression classification, and facilitates interfacing with smart city systems. This study aims to explore an AU-based soundscape evaluation method. Typical urban sound environments were captured and participants' facial expression data were recorded in a laboratory setting. Analysis of skin conductance levels (SCL) showed significant correlations with AU activation, indicating that AU changes reflect genuine physiological arousal triggered by acoustic environments. Natural, mechanical and humanistic sound environments activated different AUs: the natural sound environment corresponded to AU01, 04, 15, 17, 18, and 25, and the mechanical sound environment corresponded to AU01, 12, 15, 17, and 25. The intensities of these AUs also differed significantly over time. AUs can predict participants' overall assessment of the surrounding sound environment and can serve as mood variables in new evaluation methods. Links with psychoacoustic and directional features further demonstrated their sensitivity to acoustic characteristics. Illumination correction enhanced the robustness and applicability of AU measurement. The study introduces an analytical framework linking facial expression data with urban soundscape management systems to complement and cross-validate subjective self-reports, offering an objective micro-expression layer for real-time soundscape assessment.
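As a rough illustration of how AU activations might map to the sound-environment categories, the sketch below matches a set of detected AUs against the AU profiles reported in the abstract. The AU sets for the natural and mechanical environments are taken from the abstract; the overlap-scoring rule is a hypothetical illustration, not the authors' method.

```python
# Hypothetical sketch: match a set of activated Action Units (AUs) to the
# sound-environment categories reported in the abstract, using Jaccard
# overlap. The AU sets below come from the abstract; the scoring rule is
# illustrative only, not the paper's actual evaluation model.

NATURAL_AUS = {1, 4, 15, 17, 18, 25}   # natural sound environment (abstract)
MECHANICAL_AUS = {1, 12, 15, 17, 25}   # mechanical sound environment (abstract)


def jaccard(a: set, b: set) -> float:
    """Similarity between two AU sets: intersection over union."""
    return len(a & b) / len(a | b) if (a | b) else 0.0


def match_environment(active_aus: set) -> str:
    """Return the category whose reported AU profile best overlaps
    with the AUs observed in a facial-expression frame."""
    scores = {
        "natural": jaccard(active_aus, NATURAL_AUS),
        "mechanical": jaccard(active_aus, MECHANICAL_AUS),
    }
    return max(scores, key=scores.get)
```

For example, an observation of AU01, AU04 and AU18 overlaps the natural profile more strongly than the mechanical one, so `match_environment({1, 4, 18})` returns `"natural"`.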

Type: Article
Title: Characterizing urban soundscapes via facial action coding system
DOI: 10.1016/j.scs.2025.106918
Publisher version: https://doi.org/10.1016/j.scs.2025.106918
Language: English
Additional information: This version is the author-accepted manuscript. For information on re-use, please refer to the publisher’s terms and conditions.
Keywords: Facial expression recognition, Urban soundscape, Smart City, FACS
UCL classification: UCL
UCL > Provost and Vice Provost Offices > UCL BEAMS
UCL > Provost and Vice Provost Offices > UCL BEAMS > Faculty of the Built Environment > Bartlett School Env, Energy and Resources
URI: https://discovery.ucl.ac.uk/id/eprint/10217653
