Building High-fidelity Human Body Models from User-generated Data
We propose a keypoint-based approach, referred to as KPhub-PC, for estimating high-fidelity human body models from low-quality point clouds acquired with an affordable 3D scanner, together with a variant, KPhub-I, that serves the same purpose for low-resolution single images taken by smartphones. In KPhub-PC, a sparse set of annotated key points guides the deformation of the parametric 3D human body model SMPL, yielding a high-fidelity human body model that explains the target point cloud. Beyond building 3D human body models from point clouds, KPhub-I estimates accurate 3D human body models from single 2D images: the SMPL model is fitted to 2D joints and the human body boundary, both detected automatically with CNN-based methods. Since people hold stable poses most of the time, a stable-pose prior derived from the CMU motion capture dataset further improves accuracy. Extensive experiments demonstrate that, on both types of user-generated data, the proposed approaches build believable and animatable human body models robustly, and that our approach outperforms the state of the art in the accuracy of both human body shape and pose estimation.