
HelixTrack: Event-Based Tracking and RPM Estimation of Propeller-like Objects

arXiv cs.CV / 11 Mar 2026

Models & Research

Key Points

  • HelixTrack is a novel event-driven method designed for microsecond-latency tracking and RPM estimation of fast, periodic motion in propeller-like objects, addressing the limitations of traditional frame-based and event-based trackers.
  • The method uses back-warping of incoming events via a homography and a Kalman Filter for instantaneous phase estimates, with iterative updates refining object pose by coupling phase residuals to geometry.
  • HelixTrack outperforms per-event and aggregation-based baselines in both speed and accuracy, processing full-rate events at approximately 11.8 times real time with microsecond latency.
  • To support this research, the authors introduce the Timestamped Quadcopter with Egomotion (TQE) dataset, containing 13 high-resolution event sequences of 52 rotating objects, providing microsecond RPM ground truth under varying egomotion.
  • The combination of HelixTrack and TQE dataset significantly advances safety-critical perception for unmanned aerial vehicles and rotating machinery by improving tracking and RPM estimation under challenging conditions.
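The back-warping step mentioned above — mapping events from the image plane into the rotor plane via a homography — can be sketched as a standard projective transform of event pixel coordinates. This is a minimal illustrative sketch, not the paper's implementation; the function name `backwarp_events` and the identity-homography example are assumptions for demonstration.

```python
import numpy as np

def backwarp_events(xy, H):
    """Back-warp Nx2 event pixel coordinates through the inverse
    of homography H (3x3), i.e. from image plane to rotor plane."""
    n = xy.shape[0]
    pts = np.hstack([xy, np.ones((n, 1))])   # lift to homogeneous coords
    warped = pts @ np.linalg.inv(H).T        # apply H^-1 to each point
    return warped[:, :2] / warped[:, 2:3]    # dehomogenize

# With the identity homography, coordinates pass through unchanged.
xy = np.array([[10.0, 20.0], [30.0, 5.0]])
print(backwarp_events(xy, np.eye(3)))
```

In practice the homography would be re-estimated on the fly as the object pose is refined, and each event's coordinates would be warped at its own timestamp.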

Computer Science > Computer Vision and Pattern Recognition

arXiv:2603.09235 (cs)
[Submitted on 10 Mar 2026]

Title:HelixTrack: Event-Based Tracking and RPM Estimation of Propeller-like Objects

Abstract: Safety-critical perception for unmanned aerial vehicles and rotating machinery requires microsecond-latency tracking of fast, periodic motion under egomotion and strong distractors. Frame-based and event-based trackers drift or break on propellers because periodic signatures violate their smooth-motion assumptions. We tackle this gap with HelixTrack, a fully event-driven method that jointly tracks propeller-like objects and estimates their rotations per minute (RPM). Incoming events are back-warped from the image plane into the rotor plane via a homography estimated on the fly. A Kalman Filter maintains instantaneous estimates of phase. Batched iterative updates refine the object pose by coupling phase residuals to geometry. To our knowledge, no public dataset targets joint tracking and RPM estimation of propeller-like objects. We therefore introduce the Timestamped Quadcopter with Egomotion (TQE) dataset with 13 high-resolution event sequences, containing 52 rotating objects in total, captured at distances of 2 m / 4 m, with increasing egomotion and microsecond RPM ground truth. On TQE, HelixTrack processes full-rate events faster than real time (approx. 11.8x) with microsecond latency. It consistently outperforms per-event and aggregation-based baselines adapted for RPM estimation.
Subjects: Computer Vision and Pattern Recognition (cs.CV)
Cite as: arXiv:2603.09235 [cs.CV]
  (or arXiv:2603.09235v1 [cs.CV] for this version)
  https://doi.org/10.48550/arXiv.2603.09235

Submission history

From: Radim Spetlik [view email]
[v1] Tue, 10 Mar 2026 06:12:26 UTC (19,169 KB)