UCL Discovery

HUP-3D: A 3D multi-view synthetic dataset for assisted-egocentric hand-ultrasound-probe pose estimation

Birlo, Manuel; Caramalau, Razvan; Edwards, Philip J Eddie; Dromey, Brian; Clarkson, Matthew J; Stoyanov, Danail; (2024) HUP-3D: A 3D multi-view synthetic dataset for assisted-egocentric hand-ultrasound-probe pose estimation. In: Medical Image Computing and Computer Assisted Intervention – MICCAI 2024. (pp. 430-436). Springer, Cham

1531_paper.pdf - Accepted Version
Access restricted to UCL open access staff until 4 October 2025.

Download (906kB)

Abstract

We present HUP-3D, a 3D multi-view multimodal synthetic dataset for hand and ultrasound (US) probe pose estimation in the context of obstetric ultrasound. Egocentric markerless 3D joint pose estimation has potential applications in mixed reality medical education. The ability to understand hand and probe movements opens the door to tailored guidance and mentoring applications. Our dataset consists of over 31k sets of RGB, depth, and segmentation mask frames, including pose-related reference data, with an emphasis on image diversity and complexity. Adopting a camera viewpoint-based sphere concept allows us to capture a variety of views and to generate multiple hand grasp poses using a pre-trained network. Additionally, our approach includes a software-based image rendering concept that enhances diversity with varied hand and arm textures, lighting conditions, and background images. We validated our dataset with state-of-the-art learning models and obtained the lowest hand-object keypoint errors. The supplementary material details the parameters for the sphere-based camera view angles and the configuration of the grasp generation and rendering pipeline. The source code for our grasp generation and rendering pipeline, along with the dataset, is publicly available at https://manuelbirlo.github.io/HUP-3D/.
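As an illustration of the camera viewpoint-based sphere concept mentioned in the abstract, the sketch below samples camera positions roughly evenly over a sphere centred on the scene and orients each camera toward the centre. This is a minimal, hypothetical sketch: the sampling scheme (Fibonacci lattice), number of views, radius, and look-at conventions are assumptions for illustration only; the paper's actual view-angle parameters are given in its supplementary material and the linked source code.

# Hypothetical sketch of sphere-based camera viewpoint sampling (not the paper's pipeline).
# All parameter values below are illustrative assumptions.
import numpy as np

def fibonacci_sphere_viewpoints(n_views: int = 64, radius: float = 0.5) -> np.ndarray:
    """Return n_views camera positions spread roughly evenly over a sphere of the given radius."""
    i = np.arange(n_views)
    golden = (1 + 5 ** 0.5) / 2
    theta = 2 * np.pi * i / golden          # azimuth angle per view
    z = 1 - 2 * (i + 0.5) / n_views         # heights uniformly spaced in [-1, 1]
    r_xy = np.sqrt(1 - z ** 2)              # radius of the horizontal circle at height z
    return radius * np.stack([r_xy * np.cos(theta), r_xy * np.sin(theta), z], axis=1)

def look_at(cam_pos: np.ndarray, target: np.ndarray = np.zeros(3)) -> np.ndarray:
    """Camera-to-world rotation whose -z axis points from the camera position toward the target."""
    forward = target - cam_pos
    forward = forward / np.linalg.norm(forward)
    up = np.array([0.0, 0.0, 1.0])
    right = np.cross(forward, up)
    if np.linalg.norm(right) < 1e-6:        # camera directly above or below the target
        right = np.array([1.0, 0.0, 0.0])
    right = right / np.linalg.norm(right)
    true_up = np.cross(right, forward)
    return np.stack([right, true_up, -forward], axis=1)

if __name__ == "__main__":
    positions = fibonacci_sphere_viewpoints()
    rotations = [look_at(p) for p in positions]
    print(positions.shape, rotations[0].shape)  # (64, 3) (3, 3)

Each position/rotation pair could then be handed to a renderer to produce one RGB, depth, and segmentation view of a grasp; the paper's own rendering configuration differs and is documented in its supplementary material.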

Type: Proceedings paper
Title: HUP-3D: A 3D multi-view synthetic dataset for assisted-egocentric hand-ultrasound-probe pose estimation
Event: Medical Image Computing and Computer Assisted Intervention – MICCAI 2024
Location: Marrakesh, Morocco
Dates: 7 Oct 2024 - 10 Oct 2024
ISBN-13: 9783031723773
DOI: 10.1007/978-3-031-72378-0_40
Publisher version: http://dx.doi.org/10.1007/978-3-031-72378-0_40
Language: English
Additional information: This version is the author accepted manuscript. For information on re-use, please refer to the publisher’s terms and conditions.
Keywords: Egocentric 3D joint hand and tool pose estimation, Synthetic datasets, Obstetric ultrasound
UCL classification: UCL
UCL > Provost and Vice Provost Offices > UCL BEAMS
UCL > Provost and Vice Provost Offices > UCL BEAMS > Faculty of Engineering Science > Dept of Computer Science
URI: https://discovery.ucl.ac.uk/id/eprint/10198544
