AI Navigate

Follow the Saliency: Supervised Saliency for Retrieval-augmented Dense Video Captioning

arXiv cs.CV / 3/13/2026


Key Points

  • STaRC introduces supervised frame-level saliency for retrieval-augmented dense video captioning: a highlight detection module is trained on binary labels derived directly from DVC ground-truth annotations, with no extra labeling required.
  • It uses saliency scores as a unified temporal signal to drive saliency-guided segmentation for retrieval and to inform caption generation through explicit Saliency Prompts injected into the decoder.
  • The approach yields temporally coherent segments that align with actual event transitions and achieves state-of-the-art performance on YouCook2 and ViTT across most metrics.
  • The code is available on GitHub, enabling replication and practical adoption of STaRC.
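The first key point hinges on deriving binary frame-level supervision from existing DVC annotations. As a rough illustration only (the paper's actual label construction may differ), a minimal sketch of turning ground-truth event spans into per-frame highlight labels could look like this; the function name and parameters are hypothetical:

```python
from typing import List, Tuple

def frame_highlight_labels(
    events: List[Tuple[float, float]],  # ground-truth (start_sec, end_sec) per event
    num_frames: int,
    fps: float,
) -> List[int]:
    """Label a frame 1 if it falls inside any annotated event span, else 0.

    Hypothetical sketch: assumes frame i corresponds to time i / fps.
    """
    labels = [0] * num_frames
    for start, end in events:
        lo = max(0, int(start * fps))
        hi = min(num_frames, int(end * fps) + 1)
        for i in range(lo, hi):
            labels[i] = 1
    return labels
```

Because the labels come straight from the event boundaries already present in DVC datasets, no additional annotation pass is needed, which is the point the authors emphasize.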

Abstract

Existing retrieval-augmented approaches for Dense Video Captioning (DVC) often fail to achieve accurate temporal segmentation aligned with true event boundaries, as they rely on heuristic strategies that overlook ground truth event boundaries. The proposed framework, **STaRC**, overcomes this limitation by supervising frame-level saliency through a highlight detection module. Note that the highlight detection module is trained on binary labels derived directly from DVC ground truth annotations without the need for additional annotation. We also propose to utilize the saliency scores as a unified temporal signal that drives retrieval via saliency-guided segmentation and informs caption generation through explicit Saliency Prompts injected into the decoder. By enforcing saliency-constrained segmentation, our method produces temporally coherent segments that align closely with actual event transitions, leading to more accurate retrieval and contextually grounded caption generation. We conduct comprehensive evaluations on the YouCook2 and ViTT benchmarks, where STaRC achieves state-of-the-art performance across most of the metrics. Our code is available at https://github.com/ermitaju1/STaRC
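The abstract describes saliency-guided segmentation: predicted per-frame saliency scores are grouped into temporally coherent segments that then drive retrieval. The paper's exact segmentation rule is not given here, but one simple instance of the idea is to merge contiguous runs of above-threshold frames into segments; the function name, threshold, and minimum-length parameters below are illustrative assumptions:

```python
from typing import List, Tuple

def saliency_segments(
    scores: List[float],
    threshold: float = 0.5,  # assumed saliency cutoff
    min_len: int = 2,        # assumed minimum segment length in frames
) -> List[Tuple[int, int]]:
    """Group contiguous above-threshold frames into (start, end) segments.

    Illustrative sketch only; `end` is exclusive. Runs shorter than
    `min_len` are discarded as noise.
    """
    segments: List[Tuple[int, int]] = []
    start = None
    for i, s in enumerate(scores):
        if s >= threshold and start is None:
            start = i
        elif s < threshold and start is not None:
            if i - start >= min_len:
                segments.append((start, i))
            start = None
    # Close a segment that runs to the end of the video.
    if start is not None and len(scores) - start >= min_len:
        segments.append((start, len(scores)))
    return segments
```

Each resulting segment would then serve as a retrieval unit, and its saliency profile could be surfaced to the decoder as a Saliency Prompt, matching the unified-signal design the abstract describes.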