[Dottorcomp] Seminari di Matematica Applicata, 12/05/2026, Eitan Tadmor, Simone Rebegoldi

Stefano Lisini stefano.lisini a unipv.it
Fri 8 May 2026 12:47:26 CEST


Applied Mathematics Seminars, Department of Mathematics "F. Casorati"
and CNR-IMATI Institute "E. Magenes", Pavia.

On Tuesday 12 May 2026, at 2:00 pm sharp, in the Beltrami lecture hall of
the Department of Mathematics of the University of Pavia,

Eitan Tadmor (University of Maryland, College Park)

will give a seminar entitled:

Swarm-Based Gradient Descent: A Multi-Agent Approach for Non-Convex
Optimization

and at 3:00 pm sharp

Simone Rebegoldi (Università di Modena e Reggio Emilia) will give a
seminar entitled:

Block Coordinate Plug-and-Play Algorithms for Image Restoration.

----------------------------

Abstract (Tadmor): We discuss a novel class of swarm-based gradient descent
(SBGD) methods for non-convex optimization. The swarm consists of agents,
each identified with a position, *x*, and a mass, *m*. There are two key
ingredients in the SBGD dynamics:

(i) a persistent transition of mass from agents on higher ground to agents
on lower ground; and
(ii) a time-stepping protocol whose step size decreases with *m*.

The interplay between positions and masses leads to a dynamic distinction
between “leaders” and “explorers”: heavier agents lead the swarm near local
minima with small time steps, while lighter agents use larger time steps to
explore the landscape in search of an improved global minimum, thereby
reducing the overall ‘loss’ of the swarm. Convergence analysis and
numerical simulations demonstrate the effectiveness of the SBGD method as a
global optimizer.
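As a rough illustration of ingredients (i) and (ii), the following sketch evolves a small swarm on a double-well function. It is a toy, not the protocol analyzed in the talk: the mass-transfer rate `eta` and the step-size rule `base_step * (1 - m)` are illustrative assumptions.

```python
import numpy as np

def sbgd(f, grad_f, x0, steps=500, eta=0.1, base_step=0.05):
    """Toy sketch of swarm-based gradient descent (SBGD).

    Assumptions (not from the talk): the leader receives the mass
    shed by higher agents, and step sizes follow base_step * (1 - m).
    """
    x = np.array(x0, dtype=float)   # agent positions, shape (n, d)
    n = len(x)
    m = np.full(n, 1.0 / n)         # agent masses, summing to 1

    for _ in range(steps):
        fx = np.array([f(xi) for xi in x])
        best = np.argmin(fx)
        # (i) agents on higher ground shed a fraction of their mass,
        #     here passed to the agent currently on the lowest ground
        transfer = eta * m * (fx - fx[best]) / (np.ptp(fx) + 1e-12)
        m = m - transfer
        m[best] += transfer.sum()
        # (ii) step size decreases with mass: heavy "leaders" take
        #      small, careful steps; light "explorers" take large ones
        h = base_step * (1.0 - m)
        x = x - h[:, None] * np.array([grad_f(xi) for xi in x])

    fx = np.array([f(xi) for xi in x])
    return x[np.argmin(fx)]         # position of the best agent
```

On the double-well f(x) = (x^2 - 1)^2, a swarm started on a uniform grid concentrates its mass near one of the two global minima at x = ±1, while agents stranded on high ground keep losing mass and influence.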

----------------------------

Abstract (Rebegoldi): Plug-and-Play (PnP) algorithms represent a powerful
and innovative paradigm for solving inverse problems in computational
imaging. These methods replace the usual explicit regularization step with
an implicit step, implemented by means of a denoising algorithm. In the
last decade, PnP methods have achieved state-of-the-art results in several
image restoration tasks, often outperforming traditional regularization
techniques. In [1], a new denoising operator was proposed, defined
explicitly as the gradient step of a potential function parametrized by a
convolutional neural network. Such denoisers allow for a clearer
variational interpretation of PnP methods; however, they are
memory-intensive, as their implementation requires storing large
computational graphs.

We propose a novel class of block coordinate PnP methods to address imaging
inverse problems [2]. The block coordinate strategy is designed to reduce
the high memory consumption arising in PnP methods that rely on Gradient
Step denoisers. The proposed methods are based on a block coordinate
forward-backward framework for solving nonconvex and nonseparable composite
optimization problems. Under mild assumptions on the objective function, we
establish a sublinear convergence rate and the stationarity of the limit
points. Moreover, convergence of the entire sequence of the iterates is
guaranteed under a Kurdyka-Łojasiewicz assumption. Numerical experiments on
ill-posed imaging problems, including deblurring and super-resolution,
demonstrate that the proposed PnP approach achieves state-of-the-art
reconstruction quality while substantially reducing GPU memory requirements.
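To make the structure of such a scheme concrete, here is a minimal sketch under strong simplifying assumptions: a least-squares data fidelity, cyclic block selection with a fixed step size in place of the Armijo-like line search of [2], and a toy quadratic potential standing in for the CNN-parametrized potential of [1].

```python
import numpy as np

def gradient_step_denoiser(v, lam=0.1):
    """Gradient-step denoiser D = I - grad(g) for the toy potential
    g(v) = lam/2 * ||v||^2, a stand-in for a learned potential."""
    return v - lam * v

def block_pnp_fb(y, A, blocks, x0, denoiser, steps=200, tau=None):
    """Sketch of a block coordinate forward-backward PnP iteration.

    Cyclic block selection and a fixed step size are illustrative
    simplifications of the line-search strategy in [2].
    """
    x = np.array(x0, dtype=float)
    if tau is None:
        tau = 1.0 / np.linalg.norm(A, 2) ** 2   # step below 1/L
    for k in range(steps):
        j = blocks[k % len(blocks)]             # cyclic block selection
        r = A @ x - y                           # residual of the full x
        # forward (gradient) step on block j of 1/2 ||A x - y||^2
        x[j] = x[j] - tau * (A[:, j].T @ r)
        # backward step: denoise block j only, so in the neural case
        # only one block's computational graph is stored at a time
        x[j] = denoiser(x[j])
    return x
```

Because each backward step touches a single block of variables, only that block's computational graph would need to be kept in memory in the neural-network case, which is the source of the memory saving described above.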

[1] S. Hurault, A. Leclaire, and N. Papadakis, Gradient step denoiser for
convergent Plug-and-Play, in International Conference on Learning
Representations (ICLR), 2022.

[2] F. Porta, S. Rebegoldi, A. Sebastiani, Block-coordinate Plug-And-Play
Methods with Armijo-like line-search for Image Restoration,
arXiv:2603.01734, 2026.
-----------------------------------------
Applied Mathematics Seminars web page
https://matematica.unipv.it/ricerca/cicli-di-seminari/seminari-di-matematica-applicata/



More information about the Dottorcomp mailing list