Preprint, Working Paper. Year: 2025

Proximal gradient descent on the smoothed duality gap to solve saddle point problems

Abstract

In this paper, we minimize the self-centered smoothed gap, a recently introduced optimality measure, in order to solve convex-concave saddle point problems. The self-centered smoothed gap can be computed as the sum of a convex, possibly nonsmooth function and a smooth weakly convex function. Although this quantity is not convex, we propose an algorithm that minimizes it, effectively reducing convex-concave saddle point problems to a minimization problem. Its worst-case complexity is comparable to that of the restarted and averaged primal-dual hybrid gradient method, and the algorithm enjoys linear convergence in favorable cases.
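The abstract describes minimizing a composite objective: a convex, possibly nonsmooth function plus a smooth (weakly convex) function, via a proximal gradient method. As a hedged illustration of the generic proximal gradient template only — shown here on a toy convex lasso-type instance, not on the paper's actual smoothed-duality-gap objective — one might sketch:

```python
import numpy as np

def prox_l1(v, t):
    # Proximal operator of t * ||.||_1 (soft-thresholding);
    # stands in for the prox of the convex nonsmooth part g.
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def proximal_gradient(grad_f, prox_g, x0, step, n_iter=500):
    # Generic proximal gradient iteration for min_x f(x) + g(x):
    #   x_{k+1} = prox_{step*g}(x_k - step * grad_f(x_k))
    x = x0
    for _ in range(n_iter):
        x = prox_g(x - step * grad_f(x), step)
    return x

# Toy instance (an assumption for illustration, not the paper's objective):
# f(x) = 0.5 * ||A x - b||^2 (smooth), g(x) = lam * ||x||_1 (convex, nonsmooth)
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 5))
b = rng.standard_normal(20)
lam = 0.1
grad_f = lambda x: A.T @ (A @ x - b)
step = 1.0 / np.linalg.norm(A, 2) ** 2  # 1 / Lipschitz constant of grad_f

x_star = proximal_gradient(grad_f,
                           lambda v, t: prox_l1(v, lam * t),
                           np.zeros(5), step)
```

The paper's setting differs in that the smooth part is only weakly convex, so its convergence analysis is more delicate than in this convex sketch.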

Main file
minimize_smoothed_gap.pdf (431 KB)
Origin: Files produced by the author(s)

Dates and versions

hal-05346630, version 1 (04-11-2025)

Identifiers

  • HAL Id: hal-05346630, version 1

Cite

Olivier Fercoq. Proximal gradient descent on the smoothed duality gap to solve saddle point problems. 2025. ⟨hal-05346630⟩
