One-Shot Real-World Demonstration Synthesis for Scalable Bimanual Manipulation
arXiv cs.RO / 4/28/2026
💬 Opinion · Models & Research
Key Points
- The paper introduces BiDemoSyn, a framework for synthesizing thousands of contact-rich, physically feasible bimanual demonstrations starting from a single real-world example.
- BiDemoSyn sidesteps the trade-off between labor-intensive teleoperation and simulation's sim-to-real gap by decomposing each task into invariant coordination blocks and variable, object-dependent adjustments, aligning the latter to the observed scene via vision and refining the result with lightweight trajectory optimization (a sketch of this idea follows the list).
- Experiments across six dual-arm tasks show that policies trained on BiDemoSyn demonstrations generalize robustly to new object poses and shapes, outperforming recent baselines.
- The method extends from one-shot to few-shot synthesis, increasing object-level diversity and improving out-of-distribution generalization while remaining data-efficient.
- The trained policies also demonstrate zero-shot cross-embodiment transfer to new robot platforms, enabled by object-centric observations and a simplified 6-DoF end-effector action representation (see the second sketch below).
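
To make the decomposition concrete, here is a minimal Python sketch, not the authors' code, of the synthesis idea: keep the invariant bimanual-coordination segments of a single demonstration fixed, re-target the object-dependent segments with a vision-estimated object pose transform, and then apply a lightweight smoothing pass as a stand-in for trajectory optimization. All function and variable names are hypothetical.

```python
# Sketch only: synthesize a new trajectory from one demo under an assumed
# segmentation into "invariant" and "object"-dependent waypoint segments.
import numpy as np


def retarget_segment(waypoints, T_old_obj, T_new_obj):
    """Map Nx3 end-effector waypoints defined relative to the old object pose
    into the frame of the newly observed object pose (4x4 homogeneous)."""
    T = T_new_obj @ np.linalg.inv(T_old_obj)          # relative object transform
    pts = np.c_[waypoints, np.ones(len(waypoints))]   # Nx4 homogeneous points
    return (pts @ T.T)[:, :3]


def smooth(waypoints, iters=20, alpha=0.2):
    """Cheap stand-in for the paper's lightweight trajectory refinement:
    iterative averaging that keeps the endpoints fixed."""
    w = waypoints.copy()
    for _ in range(iters):
        w[1:-1] = (1 - alpha) * w[1:-1] + alpha * 0.5 * (w[:-2] + w[2:])
    return w


def synthesize(demo_segments, T_old_obj, T_new_obj):
    """demo_segments: list of (kind, Nx3 waypoints) with kind in
    {"invariant", "object"}; returns one synthesized trajectory."""
    out = []
    for kind, seg in demo_segments:
        if kind == "object":                 # variable, object-dependent part
            seg = retarget_segment(seg, T_old_obj, T_new_obj)
        out.append(seg)                      # invariant coordination kept as-is
    return smooth(np.vstack(out))


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    demo = [("object", rng.uniform(size=(10, 3))),
            ("invariant", rng.uniform(size=(10, 3)))]
    T_old = np.eye(4)
    T_new = np.eye(4)
    T_new[:3, 3] = [0.05, -0.02, 0.0]        # object observed at a new pose
    print(synthesize(demo, T_old, T_new).shape)   # (20, 3)
```

Sampling many new object poses (and, in the few-shot setting, a handful of object shapes) and running this retarget-then-refine loop is what turns one real demonstration into thousands of physically plausible variants.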
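A second sketch, again an assumed interface rather than the paper's exact one, shows why such policies can transfer across robots: observations are object-centric and actions are per-arm 6-DoF end-effector deltas plus gripper commands, so nothing in the interface refers to a specific robot's joints. The `ik_solver` adapter is hypothetical.

```python
# Sketch of an embodiment-agnostic observation/action interface (assumed).
from dataclasses import dataclass
import numpy as np


@dataclass
class ObjectCentricObs:
    obj_pose: np.ndarray         # object pose from vision (xyz + quaternion)
    left_ee_in_obj: np.ndarray   # left end-effector pose in the object frame
    right_ee_in_obj: np.ndarray  # right end-effector pose in the object frame


@dataclass
class BimanualAction:
    left_delta: np.ndarray       # 6-DoF delta: xyz translation + axis-angle rotation
    right_delta: np.ndarray      # 6-DoF delta for the right arm
    left_grip: float             # gripper command in [0, 1]
    right_grip: float


def apply_on_new_robot(action: BimanualAction, ik_solver):
    """Any platform exposing an end-effector IK solver can execute the same
    action; only this thin adapter is embodiment-specific."""
    q_left = ik_solver.solve("left", action.left_delta)
    q_right = ik_solver.solve("right", action.right_delta)
    return q_left, q_right
```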