AhaRobot: A Low-Cost Open-Source Bimanual Mobile Manipulator for Embodied AI

arXiv cs.RO / 5/6/2026


Key Points

  • The paper introduces AhaRobot, a fully open-source, low-cost bimanual mobile manipulator designed to reduce bottlenecks in scaling embodied AI by enabling cheaper data collection for vision-language-action manipulation.
  • It uses a SCARA-like dual-arm hardware design to lower motor torque requirements while preserving a large vertical reachable workspace for mobile manipulation tasks.
  • The control stack improves manipulation precision through dual-motor backlash mitigation and static-friction compensation via dithering.
  • A key contribution is RoboPilot, a teleoperation interface with a novel 26-faced marker handle, which reduces tracking error by 80% versus a 6-faced baseline and increases remote data-collection efficiency by 30% over long horizons.
  • Experiments report 0.7 mm repeatability at about $1,000 in total hardware cost, and the robot supports imitation learning of complex, contact-rich household behaviors with data quality comparable to VR-based collection.
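The static-friction compensation mentioned above works by superimposing a small high-frequency oscillation (dither) on the motor command so the joint never fully settles into stiction. A minimal sketch of the idea, with illustrative amplitude and frequency values (the function name and parameters are hypothetical, not the paper's tuned implementation):

```python
import math

def dither_command(u, t, amplitude=0.05, freq_hz=80.0):
    """Add a small high-frequency dither to the commanded torque u at time t.

    The oscillation keeps the joint on the verge of motion, so static
    friction never fully engages and small commanded motions are not
    swallowed by stiction. amplitude and freq_hz are placeholder values.
    """
    return u + amplitude * math.sin(2.0 * math.pi * freq_hz * t)
```

In practice the dither frequency is chosen above the mechanical bandwidth of the joint so the oscillation itself does not produce visible motion, only a reduction in the effective breakaway friction.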

Abstract

Scaling Vision-Language-Action models for embodied manipulation demands large volumes of diverse manipulation data, yet the high cost of commercial mobile manipulators and the difficulty of deploying teleoperation interfaces at scale remain key bottlenecks. We present AhaRobot, a low-cost, fully open-source bimanual mobile manipulator tailored for embodied AI. The system contributes: (1) a SCARA-like dual-arm hardware design that reduces motor torque demands while maintaining a large vertical reachable workspace, (2) an optimized control stack that improves precision via dual-motor backlash mitigation and static-friction compensation through dithering, and (3) RoboPilot, a teleoperation interface featuring a novel 26-faced marker handle for precise, long-horizon remote data collection. Experimental results show that our hardware-control co-design achieves 0.7 mm repeatability at a total hardware cost of only $1,000. The proposed 26-faced handle reduces tracking error by 80% over a 6-faced baseline and improves data-collection efficiency by 30%, while robustly handling singularities and supporting extremely long-horizon tasks in fully remote settings. Despite its low cost, AhaRobot enables imitation learning of complex household behaviors involving bimanual coordination, upper-body mobility, and contact-rich interaction, with data quality comparable to VR-based collection. All software, CAD files, and documentation are available at https://aha-robot.github.io.
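The intuition behind the multi-faced marker handle is that each detected face yields an independent, noisy estimate of the handle's pose; with 26 faces, several faces are visible from almost any viewpoint, and fusing their estimates reduces tracking variance. A minimal sketch of fusing per-face position estimates by averaging (rotation fusion, e.g. quaternion averaging, is omitted; the function name is hypothetical and this is not the paper's actual pipeline):

```python
import numpy as np

def fuse_face_positions(face_positions):
    """Fuse handle-center estimates obtained from each visible marker face.

    Each detected face gives an independent estimate of the handle center
    after applying that face's known rigid offset; averaging the estimates
    reduces tracking noise roughly as 1/sqrt(n_faces). Illustrative only.
    """
    est = np.asarray(face_positions, dtype=float)  # shape (n_faces, 3)
    return est.mean(axis=0)
```

This also explains the robustness to singularities and occlusion reported above: losing any single face still leaves multiple independent estimates, so tracking degrades gracefully rather than dropping out.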
