Open-H-Embodiment: A Large-Scale Dataset for Enabling Foundation Models in Medical Robotics
arXiv cs.RO / 4/24/2026
Key Points
- The paper introduces Open-H-Embodiment, a large open dataset of medical-robotics video with synchronized kinematics collected across more than 49 institutions and multiple robot platforms.
- The dataset covers several procedure types, including surgical manipulation, robotic ultrasound, and endoscopy, addressing the limitations of prior medical-robotics datasets, which were typically small, single-embodiment, and not openly shared.
- The authors demonstrate the research the dataset enables by training two foundation models. The first, GR00T-H, is a vision-language-action model evaluated on a suturing benchmark.
- GR00T-H is the only model to achieve full end-to-end task completion on the structured suturing benchmark, and it reports strong average success across a 29-step ex vivo suturing sequence.
- The second, Cosmos-H-Surgical-Simulator, is an action-conditioned world model that supports multi-embodiment surgical simulation from a single checkpoint, enabling in-silico policy evaluation and synthetic data generation.