DIREG3D: DIrectly REGress 3D Hands from Multiple Cameras

01/26/2022
by Ashar Ali, et al.

In this paper, we present DIREG3D, a holistic framework for 3D hand tracking. The proposed framework utilizes camera intrinsic parameters, 3D geometry, intermediate 2D cues, and visual information to regress the parameters of a hand mesh model. Our experiments show that cues such as the size of the 2D hand, its distance from the optical center, and radial distortion are useful for deriving highly reliable 3D poses in camera space from monocular information alone. Furthermore, we extend these results to a multi-view camera setup by fusing features from different viewpoints.
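To illustrate why 2D hand size and camera intrinsics constrain camera-space depth (a minimal pinhole-camera sketch, not the paper's learned regression; the hand width, intrinsics, and function names below are assumed example values):

```python
import numpy as np

# Illustrative sketch only: DIREG3D learns this mapping, but the underlying
# geometric cue can be seen with a simple pinhole-camera relation.
# All constants (hand width, intrinsics) are made-up example values.

def estimate_depth_from_hand_size(hand_width_px, fx, real_hand_width_m=0.09):
    """Approximate camera-space depth Z from the observed 2D hand width.

    Pinhole model: hand_width_px ~= fx * real_width / Z,
    so Z ~= fx * real_width / hand_width_px.
    real_hand_width_m is an assumed average palm width.
    """
    return fx * real_hand_width_m / hand_width_px

def backproject(u, v, z, fx, fy, cx, cy):
    """Lift a 2D pixel (u, v) at depth z into camera-space XYZ (no distortion)."""
    x = (u - cx) * z / fx
    y = (v - cy) * z / fy
    return np.array([x, y, z])

if __name__ == "__main__":
    # Example intrinsics for a 640x480 camera (assumed values).
    fx, fy, cx, cy = 600.0, 600.0, 320.0, 240.0
    # A detected 2D hand: 80 px wide, centered at pixel (400, 260).
    z = estimate_depth_from_hand_size(hand_width_px=80.0, fx=fx)
    xyz = backproject(400.0, 260.0, z, fx, fy, cx, cy)
    print(f"approximate camera-space hand center: {xyz}, depth ~ {z:.2f} m")
```

The offset of the hand from the optical center (cx, cy) enters through the back-projection step, which is one way to read the abstract's claim that distance from the optical center is an informative cue.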
