A class of modified accelerated proximal gradient methods for nonsmooth and nonconvex minimization problems

Abstract Extrapolation, restart, and stepsize strategies are powerful tools for accelerating the convergence rates of first-order algorithms. In this paper, we propose a modified accelerated proximal gradient algorithm (modAPG) that incorporates an adaptive nonmonotone stepsize strategy, extrapolation...
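The abstract builds on the classical accelerated proximal gradient (APG) scheme, augmented with extrapolation, restart, and an adaptive stepsize. As background, here is a minimal Python sketch of a generic FISTA-type APG iteration with a simple restart heuristic; the fixed stepsize, the restart test, and the LASSO usage example are illustrative assumptions, not the paper's adaptive nonmonotone stepsize or its modAPG variant.

import numpy as np

def apg_restart(grad_f, prox_g, x0, step, max_iter=500, tol=1e-8):
    # Generic FISTA-type accelerated proximal gradient iteration with a
    # gradient-based restart heuristic. Illustrative sketch only: a fixed
    # stepsize `step` stands in for the paper's adaptive nonmonotone
    # stepsize strategy.
    x_prev = x0.copy()
    y = x0.copy()
    t = 1.0
    x = x0
    for _ in range(max_iter):
        # proximal gradient step at the extrapolated point y
        x = prox_g(y - step * grad_f(y), step)
        # restart heuristic: drop the momentum when the latest update and
        # the previous progress direction disagree
        if np.dot(y - x, x - x_prev) > 0:
            t = 1.0
        t_next = (1.0 + np.sqrt(1.0 + 4.0 * t * t)) / 2.0
        # extrapolation (momentum) step
        y = x + ((t - 1.0) / t_next) * (x - x_prev)
        if np.linalg.norm(x - x_prev) <= tol * max(1.0, np.linalg.norm(x)):
            break
        x_prev, t = x, t_next
    return x

# usage sketch: LASSO, f(x) = 0.5*||Ax - b||^2, g(x) = lam*||x||_1
rng = np.random.default_rng(0)
A, b, lam = rng.standard_normal((40, 100)), rng.standard_normal(40), 0.1
L = np.linalg.norm(A, 2) ** 2  # Lipschitz constant of grad f
soft = lambda z, s: np.sign(z) * np.maximum(np.abs(z) - lam * s, 0.0)
x_hat = apg_restart(lambda x: A.T @ (A @ x - b), soft, np.zeros(100), 1.0 / L)

The restart resets the momentum parameter t to 1 whenever the inner-product test fires, making the next extrapolation coefficient zero; this standard restart heuristic illustrates one of the three ingredients (extrapolation, restart, stepsize) the abstract highlights.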

Authors:

Wang, Ting [author]

Liu, Hongwei

Format:

E-Article

Language:

English

Published:

2023

Keywords:

Accelerated proximal gradient method

Kurdyka-Łojasiewicz property

Convergence

Note:

© The Author(s), under exclusive licence to Springer Science+Business Media, LLC, part of Springer Nature 2023. Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.

Contained in:

Numerical Algorithms - Bussum : Baltzer, 1991, 95 (2023), no. 1, 30 June, pp. 207-241

Links:

Full text

DOI / URN:

10.1007/s11075-023-01569-y

Catalog ID:

SPR054261503
