Bayesian Active Object Recognition and 6D Pose Estimation from Multimodal Contact Sensing

arXiv cs.RO / 3/24/2026

Key Points

  • The paper proposes an active tactile exploration framework that jointly performs object recognition and 6D pose estimation using a Bayesian belief over both object class and pose.

Abstract

We present an active tactile exploration framework for joint object recognition and 6D pose estimation. The proposed method integrates wrist force/torque sensing, GelSight tactile sensing, and free-space constraints within a Bayesian inference framework that maintains a belief over object class and pose during active tactile exploration. By combining contact and non-contact evidence, the framework reduces ambiguity and improves robustness in the joint class-pose estimation problem. To enable efficient inference in the large hypothesis space, we employ a customized particle filter that progressively samples particles based on new observations. The inferred belief is further used to guide active exploration by selecting informative next touches under reachability constraints. For effective data collection, a motion planning and control framework is developed to plan and execute feasible paths for tactile exploration, handle unexpected contacts, and maintain GelSight-surface alignment via tactile servoing. We evaluate the framework in simulation and on a Franka Panda robot using 11 YCB objects. Results show that incorporating tactile and free-space information substantially improves recognition and pose estimation accuracy and stability, while reducing the number of action cycles compared with force/torque-only baselines. Code, dataset, and supplementary material will be made available online.
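To make the particle-filter idea concrete, below is a minimal sketch of a Bayesian belief update over joint class-pose hypotheses. This is not the paper's implementation: the per-class surface model, the 3-vector pose, the likelihood function, and all constants are illustrative assumptions, and the pose is simplified from 6D to a 3D position.

```python
import numpy as np

rng = np.random.default_rng(0)

# Joint class-pose particles (assumed structure for illustration):
# each particle is a class index, a simplified 3D pose, and a weight.
N_CLASSES = 3
N_PARTICLES = 200

classes = rng.integers(0, N_CLASSES, size=N_PARTICLES)
poses = rng.uniform(-0.1, 0.1, size=(N_PARTICLES, 3))
weights = np.full(N_PARTICLES, 1.0 / N_PARTICLES)

def contact_likelihood(class_id, pose, contact_point, sigma=0.01):
    """Toy likelihood of a measured contact point under a hypothesis.
    A real system would query the object model's surface; here a fake
    per-class nominal contact point stands in (assumption)."""
    nominal = np.array([0.02 * class_id, 0.0, 0.0])
    d = np.linalg.norm((pose + nominal) - contact_point)
    return np.exp(-0.5 * (d / sigma) ** 2)

def update(contact_point):
    """One Bayesian update: reweight particles by the observation
    likelihood, normalize, and resample if the effective sample
    size collapses (jitter keeps hypothesis diversity)."""
    global classes, poses, weights
    lik = np.array([contact_likelihood(c, p, contact_point)
                    for c, p in zip(classes, poses)])
    weights = weights * lik
    weights /= weights.sum()
    ess = 1.0 / np.sum(weights ** 2)  # effective sample size
    if ess < N_PARTICLES / 2:
        idx = rng.choice(N_PARTICLES, size=N_PARTICLES, p=weights)
        classes, poses = classes[idx], poses[idx].copy()
        poses += rng.normal(0.0, 0.002, poses.shape)
        weights = np.full(N_PARTICLES, 1.0 / N_PARTICLES)

# Simulate one touch; the class posterior is the weight mass per class.
update(np.array([0.02, 0.0, 0.0]))
class_posterior = np.bincount(classes, weights=weights, minlength=N_CLASSES)
print(class_posterior)
```

After each touch, the normalized per-class weight mass serves as the recognition posterior, while the weighted particle cloud within the winning class approximates the pose belief; the paper additionally fuses force/torque and free-space evidence and uses the belief to choose informative next touches.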