On the Objective and Feature Weights of Minkowski Weighted k-Means
arXiv cs.LG / 3/30/2026
Key Points
- The paper analyzes Minkowski weighted k-means (mwk-means), which extends classical k-means with per-feature weights and a Minkowski distance, aiming to close the gap between its strong empirical performance and its limited theoretical understanding.
- It reformulates the mwk-means objective as a power-mean aggregation of within-cluster dispersions, showing that the Minkowski exponent p governs whether the method behaves more selectively or more uniformly across features.
- The authors derive bounds on the objective value and characterize the learned feature-weight structure, proving weights depend on relative dispersion and follow a power-law relationship with dispersion ratios.
- The resulting theory provides explicit guarantees on how high-dispersion (less reliable) features are suppressed.
- The paper also establishes convergence and offers a unified theoretical interpretation of mwk-means behavior.
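The power-law relationship between feature weights and dispersions described above can be illustrated with the standard mwk-means weight update, w_v = 1 / Σ_u (D_v / D_u)^{1/(p-1)}, where D_v is the within-cluster dispersion of feature v. The sketch below is a minimal, hypothetical implementation of that update alone (not the paper's full algorithm); the function and variable names are my own.

```python
import numpy as np

def mwk_weights(dispersions, p):
    """Minkowski weighted k-means feature-weight update (sketch).

    dispersions: positive per-feature within-cluster dispersions D_v.
    p: Minkowski exponent, p > 1.
    Returns weights w_v = 1 / sum_u (D_v / D_u)^(1/(p-1)),
    so higher-dispersion (less reliable) features get smaller weights.
    """
    D = np.asarray(dispersions, dtype=float)
    # Pairwise dispersion ratios raised to the power-law exponent 1/(p-1).
    ratios = (D[:, None] / D[None, :]) ** (1.0 / (p - 1.0))
    return 1.0 / ratios.sum(axis=1)

# Example: for p = 2 the weights are proportional to 1 / D_v,
# so the feature with the largest dispersion is suppressed most.
w = mwk_weights([1.0, 2.0, 4.0], p=2.0)
```

The weights always sum to one, and as p grows the update spreads weight more uniformly across features, while p near 1 concentrates weight on the lowest-dispersion feature, mirroring the selective-vs-uniform behavior of the exponent p noted in the summary.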