The InterHand2.6M dataset is a large-scale, real-captured dataset with accurate ground-truth (GT) 3D interacting hand poses, used for 3D hand pose estimation. It contains 2.6M labeled single- and interacting-hand frames.
Variants: InterHand2.6M
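Working with the GT 3D poses usually means projecting them into image space with per-camera parameters. Below is a minimal sketch of that pinhole projection; the parameter names (`campos`, `camrot`, `focal`, `princpt`) follow the convention used by the dataset's baseline code, but the values here are illustrative assumptions, not real annotations from the release.

```python
import numpy as np

def world_to_cam(joints_world, campos, camrot):
    """World-space joints (N, 3) -> camera space via R @ (x - t)."""
    return (camrot @ (joints_world - campos).T).T

def cam_to_pixel(joints_cam, focal, princpt):
    """Perspective projection: (N, 3) camera space -> (N, 2) pixel coords."""
    x = joints_cam[:, 0] / joints_cam[:, 2] * focal[0] + princpt[0]
    y = joints_cam[:, 1] / joints_cam[:, 2] * focal[1] + princpt[1]
    return np.stack([x, y], axis=1)

# Illustrative values only (one joint, millimeters); real annotations
# supply per-capture, per-camera extrinsics and intrinsics.
joints_world = np.array([[100.0, 50.0, 800.0]])
campos = np.zeros(3)          # camera at the world origin
camrot = np.eye(3)            # no rotation
focal = (1500.0, 1500.0)      # fx, fy in pixels
princpt = (256.0, 256.0)      # principal point

joints_cam = world_to_cam(joints_world, campos, camrot)
uv = cam_to_pixel(joints_cam, focal, princpt)
print(uv)  # [[443.5  349.75]]
```

Evaluation metrics such as MPJPE are computed on the 3D joints directly; the projection above is what links those GT joints to the 2D images.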
This dataset is used in 2 benchmarks:
Task | Model | Paper | Date |
---|---|---|---|
3D Interacting Hand Pose Estimation | VM-BHINet | VM-BHINet: Vision Mamba Bimanual Hand Interaction … | 2025-04-20 |
3D Interacting Hand Pose Estimation | EANet | Extract-and-Adaptation Network for 3D Interacting … | 2023-09-05 |
3D Interacting Hand Pose Estimation | ACR | ACR: Attention Collaboration-based Regressor for … | 2023-03-10 |
3D Interacting Hand Pose Estimation | DIR | Decoupled Iterative Refinement Framework for … | 2023-02-05 |
3D Interacting Hand Pose Estimation | IntagHand | Interacting Attention Graph for Single … | 2022-03-17 |
3D Interacting Hand Pose Estimation | DIGIT | Learning to Disambiguate Strongly Interacting … | 2021-07-01 |
3D Interacting Hand Pose Estimation | Keypoint Transformer | Keypoint Transformer: Solving Joint Identification … | 2021-04-29 |
3D Interacting Hand Pose Estimation | InterNet | InterHand2.6M: A Dataset and Baseline … | 2020-08-21 |
3D Hand Pose Estimation | Epipolar Transformers | Epipolar Transformers | 2020-05-10 |