TAMEn: Tactile-Aware Manipulation Engine for Closed-Loop Data Collection in Contact-Rich Tasks

arXiv cs.RO / 4/9/2026


Key Points

  • The paper introduces TAMEn, a tactile-aware manipulation engine designed for closed-loop data collection in contact-rich, bimanual robotic tasks where existing handheld methods struggle with hardware adaptability and data quality.
  • TAMEn uses a cross-morphology wearable interface to adapt rapidly across heterogeneous grippers, and combines two data-collection modes: a motion-capture precision mode for high-fidelity demonstrations and a VR-based portable mode for in-the-wild acquisition with tactile-visualized recovery teleoperation.
  • It implements feasibility-aware online checking during demonstration to improve replayability and to enable collection of interactive recovery data that better reflects authentic tactile signals.
  • The approach unifies large-scale tactile pretraining data, task-specific bimanual demonstrations, and human-in-the-loop recovery data into a pyramid-structured dataset regime for closed-loop policy refinement.
  • Experiments report large gains, with task success rates rising from 34% to 75% across diverse bimanual tasks, and the authors open-source the hardware and dataset to support reproducibility in visuo-tactile manipulation research.
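The feasibility-aware online check mentioned above can be illustrated with a minimal sketch. The paper does not publish this interface, so every name, threshold, and the simplistic spherical-workspace reachability model below are illustrative assumptions; a real system would run inverse kinematics and collision checks instead:

```python
# Hypothetical sketch of feasibility-aware online checking during demonstration.
# All identifiers, limits, and the workspace model are assumptions for illustration.
import math
from dataclasses import dataclass


@dataclass
class Frame:
    """One demonstrated end-effector sample: position (m) and gripper width (m)."""
    x: float
    y: float
    z: float
    grip_width: float


def is_feasible(frame: Frame,
                reach: float = 0.85,          # assumed arm reach from the base (m)
                z_min: float = 0.0,           # table surface as the workspace floor
                grip_range=(0.0, 0.11)) -> bool:
    """Return True if this demonstrated frame looks replayable by the robot.

    Approximation only: a spherical workspace plus gripper-stroke limits stand
    in for full IK and collision checking.
    """
    within_reach = math.sqrt(frame.x**2 + frame.y**2 + frame.z**2) <= reach
    above_table = frame.z >= z_min
    grip_ok = grip_range[0] <= frame.grip_width <= grip_range[1]
    return within_reach and above_table and grip_ok


def filter_demo(frames):
    """Split a demonstration stream into accepted frames and flagged indices.

    Flagging infeasible segments as they occur lets the demonstrator retry
    immediately, which is the closed-loop property the check is meant to provide.
    """
    accepted, flagged = [], []
    for i, f in enumerate(frames):
        if is_feasible(f):
            accepted.append(f)
        else:
            flagged.append(i)
    return accepted, flagged
```

Run on a short stream, a pose beyond the assumed reach and one below the table are flagged while the in-workspace pose is kept, mirroring how infeasible demonstration segments would be surfaced for re-collection.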

Abstract

Handheld paradigms offer an efficient and intuitive way to collect large-scale demonstrations of robot manipulation. However, achieving contact-rich bimanual manipulation through these methods remains a pivotal challenge, substantially hindered by limited hardware adaptability and data efficacy. Prior hardware designs remain gripper-specific and often face a trade-off between tracking precision and portability. Furthermore, the lack of online feasibility checking during demonstration leads to poor replayability. More importantly, existing handheld setups struggle to collect interactive recovery data during robot execution, lacking the authentic tactile information necessary for robust policy refinement. To bridge these gaps, we present TAMEn, a tactile-aware manipulation engine for closed-loop data collection in contact-rich tasks. Our system features a cross-morphology wearable interface that enables rapid adaptation across heterogeneous grippers. To balance data quality and environmental diversity, we implement a dual-modal acquisition pipeline: a precision mode leveraging motion capture for high-fidelity demonstrations, and a portable mode utilizing VR-based tracking for in-the-wild acquisition and tactile-visualized recovery teleoperation. Building on this hardware, we unify large-scale tactile pretraining, task-specific bimanual demonstrations, and human-in-the-loop recovery data into a pyramid-structured data regime, enabling closed-loop policy refinement. Experiments show that our feasibility-aware pipeline significantly improves demonstration replayability, and that the proposed visuo-tactile learning framework increases task success rates from 34% to 75% across diverse bimanual manipulation tasks. We further open-source the hardware and dataset to facilitate reproducibility and support research in visuo-tactile manipulation.