Nonasymptotic Convergence Rates for Plug-and-Play Methods With MMSE Denoisers

arXiv stat.ML · March 25, 2026


Key Points

  • The paper shows that an MMSE denoiser under Gaussian noise can be represented as a proximal operator, but goes further by explicitly characterizing the corresponding implicit regularizer in plug-and-play (PnP) methods.
  • It derives an explicit form for this regularizer: an upper Moreau envelope of the negative log-marginal density, which implies the regularizer is 1-weakly convex.
  • Using 1-weak convexity, the authors establish what they describe as the first sublinear convergence rate guarantee for PnP proximal gradient descent when using an MMSE denoiser.
  • The theory is supported by experiments, including a 1D synthetic study that recovers the implicit regularizer and imaging applications (deblurring and computed tomography) that match the predicted sublinear convergence behavior.
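The chain of results summarized above can be sketched in generic notation (the symbols and scalings below are illustrative choices, not necessarily the authors' exact conventions). For Gaussian noise of variance $\sigma^2$ with noisy marginal density $p_\sigma$:

```latex
% Tweedie's formula: the MMSE denoiser is a gradient step on the
% log of the smoothed marginal density p_sigma.
D_\sigma(y) \;=\; \mathbb{E}[x \mid y] \;=\; y + \sigma^2 \nabla \log p_\sigma(y)

% Known representation (Gribonval): D_sigma is the proximal operator of
% some, generally nonconvex, regularizer phi_sigma.
D_\sigma \;=\; \operatorname{prox}_{\phi_\sigma}

% The paper's characterization: phi_sigma is an upper Moreau envelope of
% g = -\log p_\sigma; in one common scaling (noise-level factors omitted),
\phi_\sigma(x) \;=\; \sup_{z}\Big( g(z) \;-\; \tfrac{1}{2}\,\lVert x - z \rVert^2 \Big)
```

Since the supremum form makes $x \mapsto \phi_\sigma(x) + \tfrac{1}{2}\lVert x\rVert^2$ a supremum of affine functions of $x$, and hence convex, $\phi_\sigma$ is 1-weakly convex, which is the property driving the sublinear rate.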

Abstract

It is known that the minimum-mean-squared-error (MMSE) denoiser under Gaussian noise can be written as a proximal operator, which suffices for asymptotic convergence of plug-and-play (PnP) methods but reveals neither the structure of the induced regularizer nor convergence rates. We show that the MMSE denoiser corresponds to a regularizer that can be written explicitly as an upper Moreau envelope of the negative log-marginal density, which in turn implies that the regularizer is 1-weakly convex. Using this property, we derive (to the best of our knowledge) the first sublinear convergence guarantee for PnP proximal gradient descent with an MMSE denoiser. We validate the theory with a one-dimensional synthetic study that recovers the implicit regularizer, and with imaging experiments (deblurring and computed tomography) that exhibit the predicted sublinear behavior.
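To make the PnP proximal gradient iteration concrete, here is a minimal sketch on a toy least-squares problem. The problem sizes, the Gaussian prior, and the resulting closed-form shrinkage denoiser are illustrative assumptions (a Gaussian prior is one of the few cases where the exact MMSE denoiser is available in closed form); this is not the paper's experimental setup.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy inverse problem: recover x from b = A x + noise.
# (Dimensions and the prior are illustrative, not from the paper.)
n, m = 20, 30
A = rng.normal(size=(m, n)) / np.sqrt(m)
x_true = rng.normal(size=n)
b = A @ x_true + 0.01 * rng.normal(size=m)

tau2, sigma2 = 1.0, 0.1  # prior variance, denoiser noise level

def grad_f(x):
    # Gradient of the data-fidelity term f(x) = 0.5 * ||A x - b||^2.
    return A.T @ (A @ x - b)

def mmse_denoiser(y):
    # Exact MMSE denoiser for a zero-mean Gaussian prior N(0, tau2 I)
    # under Gaussian noise of variance sigma2: linear shrinkage toward 0.
    return (tau2 / (tau2 + sigma2)) * y

# PnP proximal gradient descent: gradient step on f, then replace the
# proximal step with the denoiser.
L = np.linalg.norm(A, 2) ** 2    # Lipschitz constant of grad_f
gamma = 1.0 / L                  # step size
x = np.zeros(n)
for _ in range(500):
    x = mmse_denoiser(x - gamma * grad_f(x))

print(np.linalg.norm(A @ x - b))  # data misfit after the PnP iterations
```

With a learned MMSE denoiser in place of the closed-form shrinkage, the loop is unchanged; the paper's contribution is that, because the implicit regularizer is 1-weakly convex, this iteration admits a sublinear (nonasymptotic) convergence rate rather than only asymptotic convergence.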