Conference Paper, Year: 2025

From stability of Langevin diffusion to convergence of proximal MCMC for non-log-concave sampling

Abstract

We consider the problem of sampling distributions stemming from non-convex potentials with the Unadjusted Langevin Algorithm (ULA). We prove the stability of the discrete-time ULA with respect to drift approximations, under the assumption that the potential is strongly convex at infinity. In many contexts, e.g. imaging inverse problems, potentials are non-convex and non-smooth. The Proximal Stochastic Gradient Langevin Algorithm (PSGLA) is a popular algorithm for handling such potentials; it combines the forward-backward optimization algorithm with a ULA step. Our main stability result, combined with properties of the Moreau envelope, allows us to derive the first proof of convergence of PSGLA for non-convex potentials. We empirically validate our methodology on synthetic data and in the context of imaging inverse problems. In particular, we observe that PSGLA exhibits faster convergence rates than the Stochastic Gradient Langevin Algorithm for posterior sampling while preserving its restoration properties.
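To make the forward-backward-plus-Langevin structure of PSGLA concrete, here is a minimal sketch, assuming a potential of the form U = f + g with f smooth (gradient `grad_f`) and g non-smooth but prox-friendly. The soft-thresholding `prox_l1`, the step size `gamma`, and the toy Gaussian-plus-L1 target are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def prox_l1(x, tau):
    """Proximal operator of tau * ||.||_1 (soft-thresholding).
    Illustrative choice of non-smooth term g; swap in the prox of your g."""
    return np.sign(x) * np.maximum(np.abs(x) - tau, 0.0)

def psgla_sample(x0, grad_f, prox_g, gamma, n_iter, rng=None):
    """Sketch of a PSGLA chain: a forward (gradient) step on the smooth
    part f, additive Gaussian Langevin noise, then a backward (proximal)
    step on g. Targets exp(-f - g) up to discretization bias."""
    rng = np.random.default_rng() if rng is None else rng
    x = np.asarray(x0, dtype=float)
    samples = []
    for _ in range(n_iter):
        noise = np.sqrt(2.0 * gamma) * rng.standard_normal(x.shape)
        x = prox_g(x - gamma * grad_f(x) + noise, gamma)
        samples.append(x.copy())
    return np.array(samples)

# Toy usage: sample from exp(-||x||^2/2 - ||x||_1),
# i.e. f(x) = ||x||^2/2 (grad_f(x) = x) and g = ||.||_1.
if __name__ == "__main__":
    chain = psgla_sample(x0=np.zeros(2), grad_f=lambda x: x,
                         prox_g=prox_l1, gamma=0.05, n_iter=5000)
    print(chain.mean(axis=0))
```

Replacing the exact gradient of f with an unbiased stochastic estimate gives the "stochastic gradient" variant the abstract refers to; the proximal step on g is what allows the non-smooth part of the potential to be handled without smoothing it first.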


Dates and versions

hal-05125391, version 1 (23-06-2025)

Cite

Marien Renaud, Valentin de Bortoli, Arthur Leclaire, Nicolas Papadakis. From stability of Langevin diffusion to convergence of proximal MCMC for non-log-concave sampling. NeurIPS 2025 - 39th Annual Conference on Neural Information Processing Systems, Dec 2025, San Diego (California), United States. ⟨hal-05125391⟩