Soft-MSM: Differentiable Context-Aware Elastic Alignment for Time Series
arXiv cs.LG / 5/4/2026
Key Points
- Soft-MSM is introduced as a differentiable, context-aware elastic alignment loss that smooths the Move-Split-Merge (MSM) distance for gradient-based time series learning.
- The method replaces MSM’s piecewise split/merge transition costs with a smooth gated surrogate, enabling gradients to flow through both the dynamic-programming recursion and the local, alignment-dependent transition structure.
- The paper derives the forward/backward recursions, soft alignment matrix, closed-form gradients, and discusses limiting behavior and a divergence-corrected formulation.
- Experiments on 112 UCR datasets show Soft-MSM achieves lower MSM barycentre loss than prior MSM barycentre methods and improves clustering and nearest-centroid classification versus Soft-DTW-based alternatives.
- An open-source implementation is provided in the aeon toolkit, facilitating adoption in time-series ML workflows.
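To make the dynamic-programming recursion in the second point concrete, here is a minimal NumPy sketch of a soft-min MSM dynamic program. It is illustrative only: the `softmin`, `msm_transition_cost`, and `soft_msm` names are ours, the transition cost below is the classical piecewise MSM definition, and the paper's smooth gated surrogate for the split/merge costs (which makes that term differentiable as well) is not reproduced here.

```python
import numpy as np

def softmin(values, gamma):
    # Smooth minimum: -gamma * log(sum(exp(-v / gamma))).
    # Recovers the hard min as gamma -> 0; differentiable for gamma > 0.
    v = -np.asarray(values, dtype=float) / gamma
    m = v.max()  # log-sum-exp stabilization
    return -gamma * (m + np.log(np.exp(v - m).sum()))

def msm_transition_cost(u, v, w, c):
    # Classical (piecewise) MSM split/merge cost for inserting point u
    # between its predecessor v and the opposite series' point w.
    # Soft-MSM replaces this piecewise rule with a smooth gated surrogate.
    if (v <= u <= w) or (v >= u >= w):
        return c
    return c + min(abs(u - v), abs(u - w))

def soft_msm(x, y, c=1.0, gamma=0.1):
    # Soft-min relaxation of the MSM dynamic program (illustrative sketch).
    n, m = len(x), len(y)
    D = np.zeros((n, m))
    D[0, 0] = abs(x[0] - y[0])
    # Boundaries: only split/merge moves are available along the edges.
    for i in range(1, n):
        D[i, 0] = D[i - 1, 0] + msm_transition_cost(x[i], x[i - 1], y[0], c)
    for j in range(1, m):
        D[0, j] = D[0, j - 1] + msm_transition_cost(y[j], y[j - 1], x[0], c)
    for i in range(1, n):
        for j in range(1, m):
            D[i, j] = softmin([
                D[i - 1, j - 1] + abs(x[i] - y[j]),                          # move (match)
                D[i - 1, j] + msm_transition_cost(x[i], x[i - 1], y[j], c),  # split/merge in x
                D[i, j - 1] + msm_transition_cost(y[j], y[j - 1], x[i], c),  # split/merge in y
            ], gamma)
    return D[-1, -1]
```

Because `softmin` lower-bounds the hard minimum, the soft distance is at most the exact MSM distance and converges to it as `gamma` shrinks; the paper's divergence-corrected formulation addresses the resulting bias (e.g. the soft self-distance being nonzero) for use as a training loss.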

