BOLT: Online Lightweight Adaptation for Preparation-Free Heterogeneous Cooperative Perception
arXiv cs.CV / 5/4/2026
Key Points
- The paper introduces “preparation-free” heterogeneous cooperative perception, targeting scenarios where agents meet online and cannot rely on offline joint training or tailored adaptation.
- It finds that straightforward cross-agent feature fusion can perform worse than ego-only perception when no prior coordination is available.
- To address this, the authors propose BOLT, a lightweight plug-and-play module that performs online adaptation using ego-as-teacher distillation without requiring ground-truth labels.
- BOLT uses high-confidence ego features as an alignment target for neighbor features, and weights neighbor contributions toward the regions where the ego's own confidence is low.
- Experiments show large gains in AP@50 (up to +32.3 points) using only 0.9M trainable parameters, consistently outperforming ego-only baselines on DAIR-V2X and OPV2V.
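The key points above describe a confidence-gated mechanism: the ego acts as a teacher where it is confident, and defers to neighbors where it is not. A minimal sketch of that idea (not the paper's actual implementation; the function name, threshold `tau`, and the simple linear gating are all illustrative assumptions):

```python
import numpy as np

def bolt_style_fusion(ego_feat, nbr_feat, ego_conf, tau=0.7):
    """Illustrative sketch of confidence-gated fusion with
    ego-as-teacher alignment. NOT the paper's exact method.

    ego_feat, nbr_feat: (C, H, W) BEV feature maps
    ego_conf:           (H, W) per-cell ego confidence in [0, 1]
    tau:                assumed confidence threshold for the teacher mask
    """
    # Ego-as-teacher distillation: align neighbor features to ego
    # features only in cells where the ego is confident (> tau),
    # so no ground-truth labels are needed.
    teacher_mask = (ego_conf > tau).astype(ego_feat.dtype)          # (H, W)
    per_cell_mse = np.mean((nbr_feat - ego_feat) ** 2, axis=0)      # (H, W)
    distill_loss = np.mean(teacher_mask * per_cell_mse)

    # Gated fusion: trust the ego where it is confident, and let the
    # neighbor contribute mainly in the ego's low-confidence regions.
    w = ego_conf[None, :, :]                                        # broadcast over channels
    fused = w * ego_feat + (1.0 - w) * nbr_feat
    return fused, distill_loss
```

In this toy form, a fully confident ego ignores the neighbor entirely, while zero ego confidence hands the cell over to the neighbor; the real module would learn the adaptation online with its 0.9M trainable parameters rather than use a fixed linear gate.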