Out-of-Sight Embodied Agents: Multimodal Tracking, Sensor Fusion, and Trajectory Forecasting

arXiv cs.RO / March 30, 2026


Key Points

  • The paper addresses trajectory prediction under real-world sensing limits by focusing on out-of-sight agents and noisy observations from occlusions or limited camera coverage.
  • It introduces major improvements to the Out-of-Sight Trajectory (OST) task, including extending the OOSTraj benchmark from pedestrians to both pedestrians and vehicles, better matching autonomous driving, robotics, and surveillance settings.
  • The proposed Vision-Positioning Denoising Module uses camera calibration to establish a correspondence between visual coordinates and sensor positions, enabling unsupervised denoising of noisy sensor trajectories even though ground-truth clean trajectories are unavailable.
  • Experiments on Vi-Fi and JRDB show state-of-the-art performance for both trajectory denoising and trajectory prediction, outperforming prior baselines and improving on classical approaches like Kalman filtering.
  • The authors claim this is the first work to use vision-positioning projection specifically for denoising noisy sensor trajectories of out-of-sight agents, establishing a stronger benchmark and opening new research directions.
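The vision-positioning correspondence described above rests on standard camera geometry: a calibrated camera maps a 3D sensor position into pixel coordinates via its intrinsic and extrinsic parameters. A minimal pinhole-projection sketch, with hypothetical calibration values (the paper's actual calibration and module details are not reproduced here):

```python
import numpy as np

# Hypothetical calibration; real values come from the camera calibration step.
K = np.array([[800.0,   0.0, 320.0],
              [  0.0, 800.0, 240.0],
              [  0.0,   0.0,   1.0]])   # intrinsic matrix
R = np.eye(3)                            # rotation, world -> camera frame
t = np.array([0.0, 0.0, 5.0])            # translation, world -> camera frame

def project_to_image(p_world):
    """Project a 3D world position into pixel coordinates (pinhole model)."""
    p_cam = R @ p_world + t              # world frame -> camera frame
    uvw = K @ p_cam                      # camera frame -> homogeneous pixels
    return uvw[:2] / uvw[2]              # perspective divide

# A noisy sensor position (e.g., from wireless positioning) mapped to pixels
uv = project_to_image(np.array([0.0, 0.0, 10.0]))  # -> array([320., 240.])
```

Projecting noisy sensor positions into the image plane in this way gives the denoising module a visual-coordinate target even when the agent itself is out of the camera's view.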

Abstract

Trajectory prediction is a fundamental problem in computer vision, vision-language-action models, world models, and autonomous systems, with broad impact on autonomous driving, robotics, and surveillance. However, most existing methods assume complete and clean observations, and therefore do not adequately handle out-of-sight agents or noisy sensing signals caused by limited camera coverage, occlusions, and the absence of ground-truth denoised trajectories. These challenges raise safety concerns and reduce robustness in real-world deployment. In this extended study, we introduce major improvements to Out-of-Sight Trajectory (OST), a task for predicting noise-free visual trajectories of out-of-sight objects from noisy sensor observations. Building on our prior work, we expand Out-of-Sight Trajectory Prediction (OOSTraj) from pedestrians to both pedestrians and vehicles, increasing its relevance to autonomous driving, robotics, and surveillance. Our improved Vision-Positioning Denoising Module exploits camera calibration to establish vision-position correspondence, mitigating the lack of direct visual cues and enabling effective unsupervised denoising of noisy sensor signals. Extensive experiments on the Vi-Fi and JRDB datasets show that our method achieves state-of-the-art results for both trajectory denoising and trajectory prediction, with clear gains over prior baselines. We also compare with classical denoising methods, including Kalman filtering, and adapt recent trajectory prediction models to this setting, establishing a stronger benchmark. To the best of our knowledge, this is the first work to use vision-positioning projection to denoise noisy sensor trajectories of out-of-sight agents, opening new directions for future research.
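Among the classical baselines the abstract mentions, Kalman filtering denoises a trajectory by fusing a motion model with noisy position measurements. A minimal constant-velocity sketch (the state layout, noise scales, and initialization here are illustrative assumptions, not the paper's configuration):

```python
import numpy as np

def kalman_denoise(observations, dt=1.0, q=1e-2, r=1.0):
    """Denoise a (T, 2) array of noisy (x, y) positions with a
    constant-velocity Kalman filter. State: [x, y, vx, vy]."""
    F = np.eye(4); F[0, 2] = F[1, 3] = dt           # state transition
    H = np.zeros((2, 4)); H[0, 0] = H[1, 1] = 1.0   # observe positions only
    Q = q * np.eye(4)                               # process noise covariance
    Rm = r * np.eye(2)                              # measurement noise covariance
    x = np.zeros(4); x[:2] = observations[0]        # start at first observation
    P = np.eye(4)
    denoised = []
    for z in observations:
        # Predict step: propagate state and covariance through the motion model
        x = F @ x
        P = F @ P @ F.T + Q
        # Update step: correct with the new measurement
        S = H @ P @ H.T + Rm
        Kg = P @ H.T @ np.linalg.inv(S)             # Kalman gain
        x = x + Kg @ (z - H @ x)
        P = (np.eye(4) - Kg @ H) @ P
        denoised.append(x[:2].copy())
    return np.array(denoised)
```

Such a filter smooths sensor noise but assumes a fixed linear motion model; the paper's learned denoising module is reported to outperform it on Vi-Fi and JRDB.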
