HRDexDB: A Large-Scale Dataset of Dexterous Human and Robotic Hand Grasps
arXiv cs.RO / 4/17/2026
Key Points
- HRDexDB introduces a large-scale, multi-modal dataset of high-fidelity dexterous grasping sequences collected from both human hands and diverse robotic hands.
- The dataset spans 100 objects and provides detailed grasping trajectories with high-precision spatiotemporal 3D ground-truth for both the agent and the manipulated objects.
- It adds synchronized multi-view video, egocentric video streams, and high-resolution tactile signals to support research into physical interaction.
- HRDexDB contains 1.4K grasping trials (including both successes and failures), packaged with aligned visual, kinematic, and tactile modalities for benchmarking and policy learning.
- By capturing human dexterity and robot execution on the same objects under comparable grasping motions, HRDexDB is positioned as a foundational benchmark for multimodal policy learning and cross-domain dexterous manipulation.
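The bullets above describe each trial as a bundle of aligned modalities (multi-view and egocentric video, 3D agent/object trajectories, tactile signals, plus a success label). A minimal sketch of what one such trial record might look like is below; all field names, shapes, and the `GraspTrial` class itself are illustrative assumptions, not the dataset's published format.

```python
from dataclasses import dataclass, field

import numpy as np


# Hypothetical schema for one HRDexDB-style grasping trial.
# Shapes: T = frames, J = hand keypoints, S = tactile sensor channels.
@dataclass
class GraspTrial:
    agent: str                 # "human" or a robotic hand model name
    object_id: str             # one of the 100 objects
    success: bool              # dataset includes both successes and failures
    hand_poses: np.ndarray     # (T, J, 3) per-frame 3D keypoint positions
    object_pose: np.ndarray    # (T, 7) object position + quaternion per frame
    tactile: np.ndarray        # (T, S) tactile readings per frame
    rgb_views: list = field(default_factory=list)  # paths to multi-view videos

    def is_aligned(self) -> bool:
        """All per-frame modalities should share the same frame count T."""
        t = self.hand_poses.shape[0]
        return self.object_pose.shape[0] == t and self.tactile.shape[0] == t


# Tiny synthetic trial: T=5 frames, J=21 keypoints, S=16 tactile channels.
trial = GraspTrial(
    agent="human",
    object_id="obj_042",
    success=True,
    hand_poses=np.zeros((5, 21, 3)),
    object_pose=np.zeros((5, 7)),
    tactile=np.zeros((5, 16)),
)
print(trial.is_aligned())  # True: frame counts match across modalities
```

A schema like this makes the cross-domain comparison in the dataset straightforward: filtering trials by `agent` and `object_id` pairs human and robot executions on the same object.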