
Zero-Shot Deformation Reconstruction for Soft Robots Using a Flexible Sensor Array and Cage-Based 3D Gaussian Modeling

arXiv cs.RO / 3/23/2026


Key Points

  • A zero-shot deformation reconstruction framework for soft robots operates without any visual supervision at inference time, relying on a static geometric proxy and real-time tactile sensing.
  • The approach combines a flexible piezoresistive sensor array with a cage-based 3D Gaussian deformation model, mapping local tactile measurements to cage control signals that drive dense Gaussian primitives for global deformations.
  • A graph attention network regresses cage displacements from tactile input, enforcing spatial smoothness and boundary-aware propagation to enable generalization to unseen soft robots in bending and twisting.
  • The system achieves IoU 0.67, SSIM 0.65, and Chamfer distance 3.48 mm while rendering photorealistic RGB in real time, demonstrating strong zero-shot generalization through tactile-geometry coupling.
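The core propagation step described above — low-dimensional cage displacements driving dense Gaussian primitives — is a form of cage-based deformation. The paper summary does not specify the exact weighting scheme, so the sketch below assumes precomputed linear blend weights (e.g., generalized barycentric coordinates computed once on the rest pose); the function and variable names are illustrative, not from the paper.

```python
import numpy as np

def deform_points(points_rest, cage_rest, cage_displacements, weights):
    """Propagate sparse cage displacements to dense point positions.

    points_rest:        (P, 3) rest-pose positions (e.g. Gaussian centers)
    cage_rest:          (C, 3) rest-pose cage control vertices
    cage_displacements: (C, 3) cage offsets (e.g. regressed from tactile input)
    weights:            (P, C) per-point blend weights over cage vertices,
                        rows summing to 1, precomputed on the rest pose
    Returns (P, 3) deformed point positions.
    """
    # Each point moves by the weight-blended displacement of the cage vertices,
    # so a handful of cage offsets deform the whole dense point set coherently.
    return points_rest + weights @ cage_displacements

# Toy example: an axis-aligned unit cage of 8 vertices and one center point.
cage = np.array([[x, y, z] for x in (0., 1.) for y in (0., 1.) for z in (0., 1.)])
pts = np.array([[0.5, 0.5, 0.5]])
w = np.full((1, 8), 1.0 / 8.0)             # uniform weights for the center point
disp = np.zeros((8, 3)); disp[:, 2] = 0.1  # lift the whole cage by 0.1 in z
out = deform_points(pts, cage, disp, w)    # center point rises by 0.1 in z
```

Because the weights are fixed after the rest-pose precomputation, runtime deformation reduces to one matrix multiply, which is what makes the mapping cheap enough for real-time rendering.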

Abstract

We present a zero-shot deformation reconstruction framework for soft robots that operates without any visual supervision at inference time. In this work, zero-shot deformation reconstruction is defined as the ability to infer object-wide deformations on previously unseen soft robots without collecting object-specific deformation data or performing any retraining during deployment. Our method assumes access to a static geometric proxy of the undeformed object, which can be obtained from an STL model. During operation, the system relies exclusively on tactile sensing, enabling camera-free deformation inference. The proposed framework integrates a flexible piezoresistive sensor array with a geometry-aware, cage-based 3D Gaussian deformation model. Local tactile measurements are mapped to low-dimensional cage control signals and propagated to dense Gaussian primitives to generate globally consistent shape deformations. A graph attention network regresses cage displacements from tactile input, enforcing spatial smoothness and structural continuity via boundary-aware propagation. Given only a nominal geometric proxy and real-time tactile signals, the system performs zero-shot deformation reconstruction of unseen soft robots in bending and twisting motions, while rendering photorealistic RGB in real time. It achieves 0.67 IoU, 0.65 SSIM, and 3.48 mm Chamfer distance, demonstrating strong zero-shot generalization through explicit coupling of tactile sensing and structured geometric deformation.
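The abstract attributes the tactile-to-cage regression to a graph attention network. The paper's architecture details are not given here, so the following is a minimal single-layer sketch in the style of the original GAT formulation (Velickovic et al.): nodes could be cage vertices, features the local tactile readings, and edges the cage connectivity. All names and shapes are illustrative assumptions; a real model would stack several such layers and add a linear head mapping each node to a 3D displacement.

```python
import numpy as np

def gat_layer(x, adj, W, a, leaky=0.2):
    """One graph-attention layer, plain-numpy sketch.

    x:   (N, F)  node features (e.g. tactile readings per cage vertex)
    adj: (N, N)  binary adjacency, self-loops included
    W:   (F, H)  shared linear projection
    a:   (2H,)   attention vector, split into source/target halves
    Returns (N, H) attended node features.
    """
    h = x @ W                                    # project node features
    H = h.shape[1]
    # Attention logits e_ij = LeakyReLU(a^T [h_i || h_j]), computed pairwise.
    e = (h @ a[:H])[:, None] + (h @ a[H:])[None, :]
    e = np.where(e > 0, e, leaky * e)            # LeakyReLU
    e = np.where(adj > 0, e, -1e9)               # mask non-edges before softmax
    alpha = np.exp(e - e.max(axis=1, keepdims=True))
    alpha = alpha / alpha.sum(axis=1, keepdims=True)
    return alpha @ h                             # attention-weighted aggregation

# Toy usage: 4 fully connected cage nodes with 5-dim tactile features.
rng = np.random.default_rng(0)
feats = rng.normal(size=(4, 5))
A = np.ones((4, 4))
out = gat_layer(feats, A, rng.normal(size=(5, 8)), rng.normal(size=(16,)))
```

The attention weights let each cage vertex pool information from its neighbors adaptively, which is one plausible mechanism for the spatial smoothness and boundary-aware propagation the abstract describes.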