Learning Equivariant Neural-Augmented Object Dynamics From Few Interactions

arXiv cs.RO / 5/5/2026

📰 News · Models & Research

Key Points

  • The paper addresses the challenge of learning data-efficient object dynamics for robotic manipulation, especially for deformable objects, where purely data-driven graph models struggle with long-horizon physical feasibility.
  • It proposes PIEGraph, a hybrid framework that combines an analytical, physically informed spring–mass particle model with an equivariant graph neural network to learn motion from limited real-world interactions.
  • The approach introduces a novel action representation that leverages symmetries in particle interactions to better guide the analytical component and improve generalization.
  • Experiments in both simulation and on real robot hardware for reorientation and repositioning tasks across ropes, cloth, stuffed animals, and rigid objects show more accurate dynamics prediction and better planning performance than existing baselines.
  • Overall, PIEGraph demonstrates that integrating physics constraints with equivariant GNN learning can reduce interaction data needs while maintaining reliable manipulation planning for long-horizon tasks.
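To make the analytical half of the hybrid concrete, here is a minimal sketch of a damped spring–mass particle system of the kind the paper uses as its physically informed component. This is an illustrative semi-implicit (symplectic) Euler integrator, not the authors' implementation; the gains `k`, `c`, `m`, and `dt` are placeholder values.

```python
import numpy as np

def spring_mass_step(pos, vel, springs, rest_len, k=50.0, c=0.5, m=1.0, dt=0.01):
    """One semi-implicit Euler step of a damped spring--mass particle system.

    pos, vel : (N, 3) particle positions and velocities
    springs  : (E, 2) integer index pairs connected by springs
    rest_len : (E,) spring rest lengths
    """
    forces = np.zeros_like(pos)
    i, j = springs[:, 0], springs[:, 1]
    d = pos[j] - pos[i]                                   # (E, 3) spring vectors
    length = np.linalg.norm(d, axis=1, keepdims=True)
    # Hooke's law along each spring direction (guard against zero length)
    f = k * (length - rest_len[:, None]) * d / np.maximum(length, 1e-9)
    np.add.at(forces, i, f)                               # equal and opposite
    np.add.at(forces, j, -f)                              # forces on endpoints
    forces -= c * vel                                     # simple velocity damping
    vel = vel + dt * forces / m                           # update velocity first,
    pos = pos + dt * vel                                  # then position (symplectic)
    return pos, vel
```

For example, two particles joined by a spring of rest length 1 but placed 2 apart will oscillate and settle back toward the rest length over repeated steps; a learned correction term would be added on top of such a step in a hybrid model.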

Abstract

Learning data-efficient object dynamics models for robotic manipulation remains challenging, especially for deformable objects. A popular approach is to model objects as sets of 3D particles and learn their motion using graph neural networks. In practice, this is not enough to maintain physical feasibility over long horizons and may require large amounts of interaction data to learn. We introduce PIEGraph, a novel approach to combining analytical physics and data-driven models to capture object dynamics for both rigid and deformable bodies using limited real-world interaction data. PIEGraph consists of two components: (1) a **P**hysically **I**nformed particle-based analytical model (implemented as a spring–mass system) to enforce physically feasible motion, and (2) an **E**quivariant **Graph** Neural Network with a novel action representation that exploits symmetries in particle interactions to guide the analytical model. We evaluate PIEGraph in simulation and on robot hardware for reorientation and repositioning tasks with ropes, cloth, stuffed animals, and rigid objects. We show that our method enables accurate dynamics prediction and reliable downstream robotic manipulation planning, outperforming state-of-the-art baselines.
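The equivariance property the abstract relies on can be illustrated with a toy message-passing update over particle positions: messages are relative displacement vectors gated by a rotation-invariant distance feature, so rotating the input scene rotates the output predictions identically. This is a generic EGNN-style sketch under assumed scalar weights `w1`, `w2`, not the paper's architecture or action representation.

```python
import numpy as np

def equivariant_layer(pos, edges, w1=0.3, w2=0.1):
    """A minimal rotation-equivariant message-passing update over particles.

    Each message is a relative position vector scaled by a function of the
    invariant squared distance; w1 and w2 stand in for learned scalars.
    """
    i, j = edges[:, 0], edges[:, 1]
    rel = pos[i] - pos[j]                           # (E, 3) relative vectors
    dist2 = np.sum(rel**2, axis=1, keepdims=True)   # rotation-invariant feature
    scale = np.tanh(w1 * dist2 + w2)                # scalar gate per edge
    update = np.zeros_like(pos)
    np.add.at(update, i, scale * rel)               # aggregate vector messages
    return pos + update
```

Because the gate depends only on distances, applying a rotation before the layer gives the same result as applying it after, which is the symmetry that lets such models generalize from few interactions.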