
A Bilingual Social Robot with Sign Language and Natural Language

Hei, Xiaoxuan; Yu, Chuang; Zhang, Heng; Tapus, Adriana; (2024) A Bilingual Social Robot with Sign Language and Natural Language. In: HRI '24: Companion of the 2024 ACM/IEEE International Conference on Human-Robot Interaction. (pp. 526-529). ACM (Association for Computing Machinery): New York, NY, USA. Green open access

Full text: HRI_2024_LBR___A_bilingual_Social_Robot.pdf (6MB)

Abstract

In situations where both deaf and non-deaf individuals are present in a public setting, it would be advantageous for a robot to communicate using both sign and natural languages simultaneously. This would not only address the needs of diverse users but also pave the way for a richer and more inclusive spectrum of human-robot interactions. To achieve this, this paper proposes a framework for a bilingual robot. The robot can articulate messages in spoken language, complemented by non-verbal cues such as expressive gestures, while concurrently conveying information through sign language. The system generates natural language expressions with speech audio, spontaneous prosody-based gestures, and sign language displayed by a virtual avatar on the robot's screen. The preliminary findings from this research showcase the robot's capacity to seamlessly blend natural language expressions with synchronized gestures and sign language, underlining its potential to transform communication dynamics in diverse settings.
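The abstract describes a system that fans one message out to three synchronized output channels: synthesized speech, prosody-based co-speech gestures, and a sign-language avatar. The paper itself does not publish code; the following is a minimal, purely illustrative sketch of that fan-out structure, in which every class, function, and placeholder string is a hypothetical stand-in (not the authors' implementation or any real TTS/avatar API).

```python
from dataclasses import dataclass

@dataclass
class BilingualOutput:
    """One message rendered on all three channels at once (illustrative)."""
    speech_audio: str          # stand-in for synthesized spoken-language audio
    gestures: list[str]        # stand-in for prosody-based co-speech gestures
    sign_sequence: list[str]   # stand-in for sign-language glosses on the avatar

def render_message(text: str) -> BilingualOutput:
    """Fan a single text message out to speech, gesture, and sign channels."""
    words = text.split()
    return BilingualOutput(
        speech_audio=f"tts({text!r})",            # placeholder for a TTS call
        gestures=[f"beat@{w}" for w in words],    # placeholder: one beat gesture per word
        sign_sequence=[w.upper() for w in words], # placeholder gloss sequence
    )

out = render_message("hello world")
# All three channels derive from the same message, so they stay aligned.
```

The point of the sketch is only the design shape the abstract implies: a single input message drives all three modalities, so synchronization is by construction rather than by post-hoc alignment.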

Type: Proceedings paper
Title: A Bilingual Social Robot with Sign Language and Natural Language
Event: HRI '24: ACM/IEEE International Conference on Human-Robot Interaction
ISBN-13: 9798400703232
Open access status: An open access version is available from UCL Discovery
DOI: 10.1145/3610978.3640549
Publisher version: https://doi.org/10.1145/3610978.3640549
Language: English
Additional information: This version is the author accepted manuscript. For information on re-use, please refer to the publisher’s terms and conditions.
Keywords: Human-robot interaction, sign language, gesture generation, virtual agent
UCL classification: UCL
UCL > Provost and Vice Provost Offices > School of Life and Medical Sciences
UCL > Provost and Vice Provost Offices > School of Life and Medical Sciences > Faculty of Brain Sciences
UCL > Provost and Vice Provost Offices > School of Life and Medical Sciences > Faculty of Brain Sciences > Div of Psychology and Lang Sciences
UCL > Provost and Vice Provost Offices > School of Life and Medical Sciences > Faculty of Brain Sciences > Div of Psychology and Lang Sciences > UCL Interaction Centre
URI: https://discovery.ucl.ac.uk/id/eprint/10194650
