The Conjugate Domain Dichotomy: Exact Risk of M-Estimators under Infinite-Variance Noise in High Dimensions
arXiv stat.ML / 3/31/2026
💬 Opinion · Ideas & Deep Analysis · Models & Research
Key Points
- The paper analyzes high-dimensional M-estimation in the proportional asymptotic regime (p/n → γ > 0) under infinite-variance noise with regularly varying tails (tail index α ∈ (1,2), so E|ε| < ∞ but Var(ε) = ∞).
- It shows that the asymptotic risk behavior of a regularized M-estimator is determined by a geometric property of the loss function: whether the domain of its Fenchel conjugate is bounded or unbounded (standard conjugate computations for the losses below are written out after this list).
- For bounded conjugate-domain losses (e.g., Huber, absolute-value, and quantile), the dual variable is effectively constrained, the noise enters the risk only through its finite first absolute moment E|ε|, and the estimator attains bounded risk without external/transfer information.
- For unbounded conjugate-domain losses such as squared loss, the dual variable grows with the noise, the risk depends on the diverging second moment, and bounded risk requires transfer regularization toward an external prior.
- For the squared-loss case, the authors derive the exact asymptotic risk using the Convex Gaussian Minimax Theorem with noise-adapted regularization, leading to a trichotomy: non-transfer squared-loss risk diverges, Huber-style boundedness yields non-vanishing risk, and transfer-regularized methods reach a universal risk floor (a simulation sketch after this list illustrates the first two regimes).
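
For concreteness, the conjugate domains behind the dichotomy can be written out directly. These are standard convex-duality computations via ℓ*(v) = sup_u {uv − ℓ(u)}, not results taken from the paper; δ denotes the Huber threshold, τ the quantile level, and ι_C the convex indicator of a set C.

```latex
% Squared loss: unbounded conjugate domain.
\ell(u) = \tfrac{1}{2}u^2
  \;\Longrightarrow\;
\ell^*(v) = \tfrac{1}{2}v^2, \qquad \operatorname{dom}\ell^* = \mathbb{R}.

% Absolute-value loss: conjugate domain [-1, 1].
\ell(u) = |u|
  \;\Longrightarrow\;
\ell^*(v) = \begin{cases} 0 & |v| \le 1 \\ +\infty & \text{otherwise,} \end{cases}
  \qquad \operatorname{dom}\ell^* = [-1, 1].

% Huber loss with threshold \delta: conjugate domain [-\delta, \delta].
\ell_\delta(u) = \begin{cases} \tfrac{1}{2}u^2 & |u| \le \delta \\ \delta|u| - \tfrac{1}{2}\delta^2 & \text{otherwise} \end{cases}
  \;\Longrightarrow\;
\ell_\delta^*(v) = \tfrac{1}{2}v^2 + \iota_{[-\delta,\delta]}(v).

% Quantile (pinball) loss at level \tau: conjugate domain [\tau - 1, \tau].
\rho_\tau(u) = \max\{\tau u, (\tau - 1)u\}
  \;\Longrightarrow\;
\operatorname{dom}\rho_\tau^* = [\tau - 1, \tau].
```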
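
The bounded-vs-diverging risk contrast in the last two points can be sanity-checked numerically. The following is a minimal Monte Carlo sketch, not the paper's estimator or its CGMT-based analysis: the sample sizes, ridge weight `lam`, and Huber threshold `delta` are arbitrary illustrative assumptions, and the ℓ2 penalty here shrinks toward zero rather than toward an external prior as transfer regularization would.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)

def pareto_noise(n, alpha, rng):
    """Symmetric regularly varying noise: P(|eps| > x) = x^(-alpha), x >= 1.
    For alpha in (1, 2), E|eps| is finite but E[eps^2] is infinite."""
    u = 1.0 - rng.uniform(size=n)            # u in (0, 1]
    signs = rng.choice([-1.0, 1.0], size=n)
    return signs * u ** (-1.0 / alpha)

def huber(u, delta):
    """Huber loss: quadratic inside [-delta, delta], linear outside."""
    a = np.abs(u)
    return np.where(a <= delta, 0.5 * u ** 2, delta * (a - 0.5 * delta))

def fit_huber_ridge(X, y, delta, lam):
    """Huber loss + plain l2 penalty (shrinkage toward zero, NOT the
    transfer regularization toward an external prior discussed above)."""
    n, p = X.shape
    def obj(b):
        return huber(y - X @ b, delta).sum() / n + 0.5 * lam * b @ b
    def grad(b):
        psi = np.clip(y - X @ b, -delta, delta)  # Huber score function
        return -X.T @ psi / n + lam * b
    return minimize(obj, np.zeros(p), jac=grad, method="L-BFGS-B").x

# Proportional regime: gamma = p / n = 0.5; tail index alpha = 1.5.
n, p, alpha = 400, 200, 1.5
reps, lam, delta = 20, 0.1, 1.0              # arbitrary illustrative choices
beta = rng.standard_normal(p) / np.sqrt(p)   # roughly unit-norm signal

ls_risk, hub_risk = [], []
for _ in range(reps):
    X = rng.standard_normal((n, p))
    y = X @ beta + pareto_noise(n, alpha, rng)
    # Ridge least squares in closed form: (X'X/n + lam*I)^{-1} X'y/n.
    b_ls = np.linalg.solve(X.T @ X / n + lam * np.eye(p), X.T @ y / n)
    b_hub = fit_huber_ridge(X, y, delta, lam)
    ls_risk.append(np.sum((b_ls - beta) ** 2))
    hub_risk.append(np.sum((b_hub - beta) ** 2))

print(f"squared loss + ridge: median risk {np.median(ls_risk):.3f}, "
      f"max {np.max(ls_risk):.3f}")
print(f"Huber loss + ridge:   median risk {np.median(hub_risk):.3f}, "
      f"max {np.max(hub_risk):.3f}")
```

Under these assumptions the squared-loss risk should fluctuate wildly across replications (its max far exceeding its median, driven by the heaviest noise draws), while the Huber risk stays stable, mirroring the bounded/unbounded conjugate-domain split described above.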