O-ConNet: Geometry-Aware End-to-End Inference of Over-Constrained Spatial Mechanisms

arXiv cs.RO / 4/3/2026

📰 News

Key Points

  • The paper introduces O-ConNet, a geometry-aware end-to-end deep learning framework for inferring structural parameters of spatial over-constrained rigid-body mechanisms from only three sparse reachable points.
  • Unlike approaches that explicitly solve constraint equations at inference time, O-ConNet reconstructs the full motion trajectory implicitly via learned representations while preserving closed-loop geometric structure.
  • Evaluated on a self-constructed Bennett 4R dataset (42,860 valid samples), O-ConNet reports Param-MAE of 0.276 ± 0.077 and Traj-MAE of 0.145 ± 0.018 across 10 runs.
  • The authors state that O-ConNet outperforms the strongest sequence baseline (LSTM-Seq2Seq) by 65.1% for parameter prediction and 88.2% for trajectory prediction.
  • The results indicate that end-to-end learning may enable practical inverse design of over-constrained spatial mechanisms under extremely sparse observations.
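The Param-MAE and Traj-MAE figures quoted above are presumably standard mean absolute errors over the predicted structural parameters and reconstructed trajectory points, respectively. A minimal sketch of such metrics (function names and array shapes are assumptions, not from the paper):

```python
import numpy as np

def param_mae(pred_params, true_params):
    # Mean absolute error over predicted structural parameters
    # (e.g., link lengths and twist angles of the mechanism).
    pred = np.asarray(pred_params, dtype=float)
    true = np.asarray(true_params, dtype=float)
    return float(np.mean(np.abs(pred - true)))

def traj_mae(pred_traj, true_traj):
    # Mean absolute error over reconstructed trajectory points,
    # averaged over all time steps and coordinates.
    pred = np.asarray(pred_traj, dtype=float)
    true = np.asarray(true_traj, dtype=float)
    return float(np.mean(np.abs(pred - true)))
```

Under this reading, the reported 0.276 and 0.145 values would be averages of these per-sample errors over the test set, with the ± term giving the standard deviation across the 10 runs.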

Abstract

Deep learning has shown strong potential for scientific discovery, but its ability to model macroscopic rigid-body kinematic constraints remains underexplored. We study this problem on spatial over-constrained mechanisms and propose O-ConNet, an end-to-end framework that infers mechanism structural parameters from only three sparse reachable points while reconstructing the full motion trajectory, without explicitly solving constraint equations during inference. On a self-constructed Bennett 4R dataset of 42,860 valid samples, O-ConNet achieves Param-MAE 0.276 ± 0.077 and Traj-MAE 0.145 ± 0.018 (mean ± std over 10 runs), outperforming the strongest sequence baseline (LSTM-Seq2Seq) by 65.1% and 88.2%, respectively. These results suggest that end-to-end learning can capture closed-loop geometric structure and provide a practical route for inverse design of spatial over-constrained mechanisms under extremely sparse observations.
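For context, a Bennett 4R linkage is only mobile when its geometry satisfies the classical Bennett conditions: opposite links share the same length and twist angle, and the two length/twist pairs obey sin(α)/a = sin(β)/b. The paper's dataset of "valid samples" presumably consists of parameter sets meeting these constraints. A hypothetical validity check (not code from the paper):

```python
import math

def is_valid_bennett(a, b, alpha, beta, tol=1e-9):
    # Classical Bennett 4R closure conditions:
    #  - opposite link lengths and twists are equal (implicit here,
    #    since only one length/twist pair per side is parameterized),
    #  - lengths and twists satisfy sin(alpha)/a == sin(beta)/b.
    if a <= 0 or b <= 0:
        return False
    return abs(math.sin(alpha) / a - math.sin(beta) / b) < tol
```

A dataset generator along these lines would sample (a, b, α, β) tuples and keep only those passing the check, which matches the paper's framing of a "self-constructed" set of valid samples.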