BALM: A Model-Agnostic Framework for Balanced Multimodal Learning under Imbalanced Missing Rates

arXiv cs.CV / 3/23/2026


Key Points

  • BALM is a model-agnostic plug-in framework for balanced multimodal learning under imbalanced missing rates (IMR), where information-rich modalities would otherwise dominate optimization.
  • It comprises two modules: the Feature Calibration Module (FCM), which recalibrates unimodal features using global context to keep representations aligned across missing patterns, and the Gradient Rebalancing Module (GRM), which modulates gradient magnitudes and directions to balance learning across modalities from both distributional and spatial perspectives (see the sketch after this list).
  • BALM is backbone-agnostic and can be integrated into diverse architectures, including multimodal emotion recognition (MER) models, without changing their structure.
  • Experimental results on multiple MER benchmarks show that BALM improves robustness and performance under a range of missing-rate and imbalance settings, with code available at the GitHub repository linked below.
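
The summary above describes the FCM only at a high level. Below is a minimal, hypothetical sketch of what such a calibration step could look like, assuming a pooled global-context vector and channel-wise gating; the class and parameter names (`FeatureCalibration`, `d_model`, the gating design) are illustrative assumptions, not the authors' implementation:

```python
# Hypothetical sketch of a feature-calibration step in the spirit of BALM's FCM.
# The pooling and gating choices here are assumptions for illustration only.
import torch
import torch.nn as nn

class FeatureCalibration(nn.Module):
    """Recalibrate each unimodal feature with a pooled global-context vector."""

    def __init__(self, d_model: int, n_modalities: int):
        super().__init__()
        # One lightweight gating head per modality (assumption: channel-wise gating).
        self.gates = nn.ModuleList(
            nn.Sequential(nn.Linear(2 * d_model, d_model), nn.Sigmoid())
            for _ in range(n_modalities)
        )

    def forward(self, feats: list[torch.Tensor], masks: torch.Tensor) -> list[torch.Tensor]:
        # feats: one (B, d_model) tensor per modality; missing features assumed zeroed.
        # masks: (B, n_modalities), 1.0 where a modality is present, 0.0 where missing.
        masks = masks.float()
        stacked = torch.stack(feats, dim=1)                            # (B, M, d)
        denom = masks.sum(dim=1, keepdim=True).clamp(min=1.0)          # avoid div by 0
        context = (stacked * masks.unsqueeze(-1)).sum(dim=1) / denom   # (B, d)
        # Gate each modality's feature on [feature; global context].
        return [f * gate(torch.cat([f, context], dim=-1))
                for f, gate in zip(feats, self.gates)]
```

Masked mean-pooling keeps the context vector well defined under any missing pattern, which is the property the FCM's "shared representation basis" claim appears to rely on.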

Abstract

Learning from multiple modalities often suffers from imbalance, where information-rich modalities dominate optimization while weaker or partially missing modalities contribute less. This imbalance becomes severe in realistic settings with imbalanced missing rates (IMR), where each modality is absent with a different probability, distorting representation learning and gradient dynamics. We revisit this issue from a training-process perspective and propose BALM, a model-agnostic plug-in framework for balanced multimodal learning under IMR. The framework comprises two complementary modules: the Feature Calibration Module (FCM), which recalibrates unimodal features using global context to establish a shared representation basis across heterogeneous missing patterns, and the Gradient Rebalancing Module (GRM), which balances learning dynamics across modalities by modulating gradient magnitudes and directions from both distributional and spatial perspectives. BALM can be seamlessly integrated into diverse backbones, including multimodal emotion recognition (MER) models, without altering their architectures. Experimental results across multiple MER benchmarks confirm that BALM consistently enhances robustness and improves performance under diverse missing and imbalance settings. Code is available at: https://github.com/np4s/BALM_CVPR2026.git
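
For intuition on the gradient-side module, here is a minimal sketch of a gradient-rebalancing step, assuming a mean-norm magnitude target for the distributional part and a PCGrad-style conflict projection for the directional part. Both rules are assumptions chosen for illustration and may differ from BALM's actual formulation:

```python
# Hypothetical sketch of gradient rebalancing in the spirit of BALM's GRM.
# The rescaling rule and the sign-agreement projection are assumptions,
# not the paper's exact method.
import torch

def rebalance_gradients(grads: list[torch.Tensor]) -> list[torch.Tensor]:
    """Equalize per-modality gradient magnitudes, then resolve direction conflicts."""
    # grads: one flattened gradient vector per modality w.r.t. shared parameters.
    norms = torch.stack([g.norm() for g in grads])
    target = norms.mean()  # common target magnitude (assumption)
    scaled = [g * (target / (n + 1e-12)) for g, n in zip(grads, norms)]

    # Spatial step: project away the conflicting component between each pair
    # (projection as in PCGrad; BALM's directional rule may differ).
    balanced = [g.clone() for g in scaled]
    for i in range(len(balanced)):
        for j in range(len(scaled)):
            if i == j:
                continue
            dot = torch.dot(balanced[i], scaled[j])
            if dot < 0:  # gradients point in conflicting directions
                balanced[i] -= dot / (scaled[j].norm() ** 2 + 1e-12) * scaled[j]
    return balanced
```

In practice, a hook like this would run on the per-modality gradients of the shared parameters just before the optimizer step, which is consistent with BALM's claim of plugging into existing backbones without architectural changes.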