Motion-Driven Multi-Object Tracking of Model Organisms in Space Science Experiments

arXiv cs.CV / April 30, 2026


Key Points

  • The paper addresses the challenge of long-term, interpretable multi-animal tracking in space-science videos, where weak appearance cues, low imaging quality, complex maneuvers, and frequent interactions make identity preservation difficult.
  • It introduces the SpaceAnimal-MOT dataset for microgravity biological videos, designed to quantify motion complexity and long-term identity tracking difficulties.
  • The authors propose ART-Track (Adaptive Robust Tracking), a motion-driven framework that uses multi-model motion estimation, motion-state-based data association, and uncertainty-adaptive fusion to better handle nonlinear and abrupt motion as well as dense interactions.
  • Experiments on zebrafish and fruitfly sequences show that ART-Track substantially reduces identity switches and improves association stability under occlusion, deformation, and high-density interactions.
  • The project code is publicly released on GitHub, enabling replication and further research on space-science multi-object tracking for model organisms.
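The summary does not spell out how ART-Track's multi-model motion estimation works internally. A common way to realize this idea, sketched here purely as an illustration and not as the paper's actual method, is to run several Kalman filters with different dynamics hypotheses (e.g., constant velocity vs. constant acceleration) in parallel and re-weight each by how well it explains each new observation. All class names and parameter values below are hypothetical:

```python
import numpy as np

class KalmanModel:
    """Minimal linear Kalman filter for one motion hypothesis (illustrative only)."""
    def __init__(self, F, H, Q, R, x0, P0):
        self.F, self.H, self.Q, self.R = F, H, Q, R
        self.x, self.P = x0.astype(float), P0.astype(float)

    def predict(self):
        self.x = self.F @ self.x
        self.P = self.F @ self.P @ self.F.T + self.Q
        return (self.H @ self.x).copy()

    def update(self, z):
        y = z - self.H @ self.x                  # innovation
        S = self.H @ self.P @ self.H.T + self.R  # innovation covariance
        K = self.P @ self.H.T @ np.linalg.inv(S)
        self.x = self.x + K @ y
        self.P = (np.eye(len(self.x)) - K @ self.H) @ self.P
        # Gaussian likelihood of the innovation: how well this model explained z.
        m = float(y @ np.linalg.inv(S) @ y)
        return float(np.exp(-0.5 * m) / np.sqrt(np.linalg.det(2 * np.pi * S)))

def make_models(dt=1.0):
    """One constant-velocity and one constant-acceleration hypothesis in 1-D."""
    F_cv = np.array([[1.0, dt], [0.0, 1.0]])
    F_ca = np.array([[1.0, dt, 0.5 * dt * dt], [0.0, 1.0, dt], [0.0, 0.0, 1.0]])
    cv = KalmanModel(F_cv, np.array([[1.0, 0.0]]), 0.01 * np.eye(2),
                     np.array([[0.1]]), np.zeros(2), 10.0 * np.eye(2))
    ca = KalmanModel(F_ca, np.array([[1.0, 0.0, 0.0]]), 0.01 * np.eye(3),
                     np.array([[0.1]]), np.zeros(3), 10.0 * np.eye(3))
    return [cv, ca]

def track(measurements, models):
    """Run all models in parallel; re-weight each by its innovation likelihood."""
    w = np.ones(len(models)) / len(models)
    for z in measurements:
        z = np.atleast_1d(np.asarray(z, dtype=float))
        liks = []
        for m in models:
            m.predict()
            liks.append(m.update(z))
        w = np.maximum(w * np.array(liks), 1e-12)  # floor avoids total collapse
        w /= w.sum()
    return w
```

Fed an accelerating target (e.g., positions 0.5·t²), the constant-acceleration hypothesis accumulates higher likelihood and dominates the weights, which is the behavior a multi-model estimator relies on to absorb abrupt maneuvers.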

Abstract

Automated animal behavior analysis relies on long-term, interpretable individual trajectories; however, multi-animal tracking in space science experimental videos remains highly challenging due to weak appearance cues, low-quality imaging, complex maneuvering behaviors, and frequent interactions. To address this problem, we first construct the SpaceAnimal-MOT dataset to characterize the motion complexity and long-term identity preservation challenges in biological videos acquired under microgravity conditions. We then propose ART-Track (Adaptive Robust Tracking), a motion-driven tracking framework tailored to this setting. Specifically, multi-model motion estimation is introduced to handle abrupt maneuvers and nonlinear motion, motion-state-driven association is designed to reduce identity switches under dense interactions and temporary mismatch, and uncertainty-adaptive fusion is used to dynamically balance spatial and motion cues when prediction reliability varies. Experimental results show that ART-Track significantly reduces identity switches on zebrafish and fruitfly sequences, while maintaining more stable association under occlusion, deformation, and high-density interactions, thereby providing a more reliable tracking foundation for downstream quantitative behavior analysis. The code is publicly available at https://github.com/yyy7777777/ART_TRACK/tree/main.
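The abstract describes uncertainty-adaptive fusion as dynamically balancing spatial and motion cues when prediction reliability varies, without giving the formula. One simple way to instantiate that idea, offered here as a hedged sketch rather than the authors' actual scheme, is to blend an IoU-based spatial cost with a motion-agreement cost using a weight that decays as the motion predictor's uncertainty grows. The function names, the exponential weighting, and `sigma_scale` are all assumptions:

```python
import numpy as np

def iou(a, b):
    """IoU of two axis-aligned boxes given as (x1, y1, x2, y2)."""
    x1, y1 = max(a[0], b[0]), max(a[1], b[1])
    x2, y2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, x2 - x1) * max(0.0, y2 - y1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    return inter / (area_a + area_b - inter + 1e-9)

def fused_cost(det_box, pred_box, motion_dist, pred_sigma, sigma_scale=5.0):
    """Blend spatial overlap and motion agreement into one association cost.

    When the motion predictor is uncertain (large pred_sigma), the weight w
    decays toward 0 and the spatial cue dominates; when it is confident,
    the motion cue dominates. Illustrative weighting, not the paper's.
    """
    w = np.exp(-pred_sigma / sigma_scale)       # 1 = trust motion, 0 = trust space
    spatial_cost = 1.0 - iou(det_box, pred_box)
    motion_cost = 1.0 - np.exp(-motion_dist)    # map distance into [0, 1)
    return w * motion_cost + (1.0 - w) * spatial_cost
```

Under this sketch, a detection that overlaps a track poorly but matches its motion prediction still scores a low cost while the predictor is reliable, and the balance flips toward spatial overlap during occlusions or erratic maneuvers when prediction uncertainty spikes.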