A deep learning approach to automatically extract 3D hand measurements

Abstract

Accurate hand measurement data is of crucial importance in medical science, the fashion industry, and augmented/virtual reality applications. Conventional methods extract hand measurements manually with a measuring tape, which is time-consuming and yields unreliable results. In this paper, we propose, to the best of our knowledge, the first deep-learning-based method to measure the hand automatically and without contact from a single 3D hand scan. The proposed method takes a 3D hand scan as input, extracts features, reconstructs the hand using a 3D hand template, transfers the measurements defined on the template, and extracts them from the reconstructed hand. To train, validate, and test the method, a novel large-scale synthetic hand dataset is generated. Results on both unseen synthetic data and unseen real scans captured by the Occipital Structure Sensor Mark I demonstrate that the proposed method outperforms the state-of-the-art method on most hand measurement types.
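The measurement-transfer step can be sketched in simplified form: a measurement (e.g. a wrist circumference) is defined once as a closed loop of vertex indices on the template, and after the template is fitted to the scan, the same indices are read off the reconstructed mesh and the loop's length is summed. The following is a minimal illustrative sketch, not the paper's implementation; the function names and the toy circular loop are assumptions for demonstration only.

```python
import numpy as np

def loop_circumference(points):
    """Circumference of a closed measurement loop: the sum of consecutive
    edge lengths, wrapping from the last vertex back to the first."""
    pts = np.asarray(points, dtype=float)
    diffs = np.diff(np.vstack([pts, pts[:1]]), axis=0)
    return float(np.linalg.norm(diffs, axis=1).sum())

def transfer_measurement(template_loop_idx, reconstructed_vertices):
    """Measurement transfer (hypothetical helper): the loop is defined as
    vertex indices on the template; after reconstruction, the same indices
    index the fitted mesh, so the loop follows the subject's geometry."""
    return reconstructed_vertices[template_loop_idx]

# Toy "reconstructed mesh": a wrist loop approximated by a circle of
# radius 3 cm sampled at 360 vertices.
theta = np.linspace(0.0, 2.0 * np.pi, 360, endpoint=False)
vertices = np.stack([3.0 * np.cos(theta),
                     3.0 * np.sin(theta),
                     np.zeros_like(theta)], axis=1)
loop_idx = np.arange(360)  # indices as they would be defined on the template
loop = transfer_measurement(loop_idx, vertices)
print(round(loop_circumference(loop), 2))  # close to 2*pi*3 ~ 18.85
```

Because the loop is a dense polyline on the mesh surface, the summed edge lengths converge to the true girth as vertex density increases, which is why template-based correspondence makes the measurement well-defined on every reconstructed hand.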

Publication
ACM International Conference on Machine Learning Technologies (ICMLT)
Pengpeng Hu
Senior Lecturer (Associate Professor)

Pengpeng Hu is currently a Senior Lecturer (Associate Professor) with The University of Manchester. His research interests include biometrics, geometric deep learning, 3D human body reconstruction, point cloud processing, and vision-based measurement. He serves as an Associate Editor for IEEE Transactions on Neural Networks and Learning Systems, IEEE Transactions on Automation Science and Engineering, and Engineering and Mathematics in Medical and Life Sciences, as well as an Academic Editor for PLOS ONE and a member of the editorial board for Scientific Reports. He is also the Programme Chair for the 25th UK Workshop on Computational Intelligence (UKCI 2026) and an Area Chair for the 35th British Machine Vision Conference (BMVC 2024). He is the recipient of the Emerald Literati Award for an outstanding paper in 2019.