Continual Multimodal Egocentric Activity Recognition via Modality-Aware Novelty Detection
arXiv cs.CV / 3/19/2026
Key Points
- The authors propose MAND, a modality-aware framework for open-world continual learning on multimodal egocentric activity streams: it detects novel activities while continually learning known classes from non-stationary data.
- It introduces Modality-aware Adaptive Scoring (MoAS), which estimates per-sample modality reliability from energy scores and adaptively fuses the modality logits, so that cues from weaker but informative modalities, especially IMU, are better exploited (see the first sketch after this list).
- During training, Modality-wise Representation Stabilization Training (MoRST) preserves modality-specific discriminability across tasks via auxiliary per-modality heads and modality-wise logit distillation (see the second sketch after this list).
- Together, these components counter RGB-dominated fused logits and underutilized IMU cues while mitigating catastrophic forgetting in the open-world setting.
- Experiments on a public multimodal egocentric benchmark show up to 10% improvement in novel activity detection AUC and up to 2.8% improvement in known-class accuracy over state-of-the-art baselines.
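The summary does not give MoAS's exact weighting rule, so the following is a minimal PyTorch sketch of the general idea only: compute a free-energy score per modality, map lower energy (higher confidence) to a larger fusion weight, and combine the modality logits per sample. The function names, the softmax-over-negative-energy weighting, and the temperature are assumptions, not the paper's implementation.

```python
import torch
import torch.nn.functional as F

def energy_score(logits: torch.Tensor, temperature: float = 1.0) -> torch.Tensor:
    """Free-energy score of a logit vector: E(x) = -T * logsumexp(logits / T).

    Lower energy indicates a more confident, more in-distribution prediction.
    """
    return -temperature * torch.logsumexp(logits / temperature, dim=-1)

def fuse_modality_logits(
    modality_logits: list[torch.Tensor],  # M tensors, each (B, C)
    temperature: float = 1.0,
) -> torch.Tensor:
    """Hypothetical sample-wise fusion: modalities whose logits have lower
    energy (higher confidence) receive larger fusion weights."""
    # (M, B): one energy per modality per sample
    energies = torch.stack([energy_score(l, temperature) for l in modality_logits])
    # Lower energy -> higher weight, normalized across modalities
    weights = F.softmax(-energies, dim=0)          # (M, B)
    logits = torch.stack(modality_logits)          # (M, B, C)
    return (weights.unsqueeze(-1) * logits).sum(dim=0)  # (B, C)

# Toy usage: RGB and IMU heads over 8 known classes
rgb_logits = torch.randn(4, 8)
imu_logits = torch.randn(4, 8)
fused = fuse_modality_logits([rgb_logits, imu_logits])
# A fused energy below a validated threshold would mark the sample as a
# known activity; above it, the sample would be flagged as novel.
print(fused.shape, energy_score(fused))
```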
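Likewise, the MoRST details (the auxiliary heads and the exact distillation loss) are not spelled out in the summary. A standard way to realize "modality-wise logit distillation" is a temperature-scaled KL divergence between each modality head of the current model and a frozen copy from the previous task; everything below, including the name `modality_logit_distillation` and the lambda-weighted combination, is a hypothetical sketch of that pattern.

```python
import torch
import torch.nn.functional as F

def modality_logit_distillation(
    student_logits: torch.Tensor,   # (B, C_old) current model, one modality head
    teacher_logits: torch.Tensor,   # (B, C_old) frozen previous-task model
    temperature: float = 2.0,
) -> torch.Tensor:
    """Temperature-scaled KL distillation on a single modality's logits over
    previously seen classes, so each modality keeps its own discriminability
    as new tasks arrive."""
    log_p_student = F.log_softmax(student_logits / temperature, dim=-1)
    p_teacher = F.softmax(teacher_logits / temperature, dim=-1)
    kd = F.kl_div(log_p_student, p_teacher, reduction="batchmean")
    return kd * temperature ** 2  # standard scaling for soft targets

# Per-task objective (schematic): classification on the new task's data
# plus one distillation term per modality head, e.g.
#   loss = ce_loss + lambda_kd * sum(
#       modality_logit_distillation(s, t)
#       for s, t in zip(student_heads, teacher_heads))
```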