Scientific article
English

Explicit stabilised gradient descent for faster strongly convex optimisation

Published in BIT Numerical Mathematics, vol. 61, no. 1, p. 119–139
Publication date 2021
Abstract

We introduce the explicit stabilised gradient descent method (ESGD) for strongly convex optimisation problems. This new algorithm is based on explicit stabilised integrators for stiff differential equations, a powerful class of numerical schemes that avoid the severe step size restriction faced by standard explicit integrators. For optimising quadratic and strongly convex functions, we prove that ESGD nearly achieves the optimal convergence rate of the conjugate gradient algorithm, and that its suboptimality diminishes as the condition number of the quadratic function worsens. We show that this optimal rate is also obtained for a partitioned variant of ESGD applied to perturbations of quadratic functions. In addition, numerical experiments on general strongly convex problems show that ESGD outperforms Nesterov's accelerated gradient descent.
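The record does not reproduce the algorithm itself. As a rough, non-authoritative illustration of the construction the abstract alludes to, the Python sketch below implements one "super-step" of the classical first-order Chebyshev (explicit stabilised) method applied to the gradient flow x'(t) = -∇f(x(t)). The stage number s, damping eta, and step size h here are illustrative assumptions, not the tuned parameter choices analysed in the paper.

```python
import numpy as np

def chebyshev_and_derivative(s, x):
    """Evaluate T_s(x) and T_s'(x) via the recurrence T_j = 2x*T_{j-1} - T_{j-2}."""
    t_prev, t_cur = 1.0, x       # T_0(x), T_1(x)
    d_prev, d_cur = 0.0, 1.0     # T_0'(x), T_1'(x)
    for _ in range(2, s + 1):
        t_next = 2.0 * x * t_cur - t_prev
        d_next = 2.0 * t_cur + 2.0 * x * d_cur - d_prev
        t_prev, t_cur = t_cur, t_next
        d_prev, d_cur = d_cur, d_next
    return t_cur, d_cur

def stabilised_gradient_step(grad, x, h, s, eta=0.05):
    """One s-stage Chebyshev 'super-step' for the gradient flow x' = -grad(x).

    Each super-step costs s gradient evaluations but is stable for step
    sizes of order s**2 times the explicit Euler limit; eta > 0 adds damping.
    """
    w0 = 1.0 + eta / s**2
    ts, dts = chebyshev_and_derivative(s, w0)
    w1 = ts / dts
    k_prev, k_cur = x, x - h * (w1 / w0) * grad(x)   # stages K_0 and K_1
    t_prev, t_cur = 1.0, w0                          # T_0(w0), T_1(w0)
    for _ in range(2, s + 1):                        # stages K_2 .. K_s
        t_next = 2.0 * w0 * t_cur - t_prev
        nu = 2.0 * w0 * t_cur / t_next               # weight on K_{j-1}
        kappa = -t_prev / t_next                     # weight on K_{j-2}
        mu = 2.0 * w1 * t_cur / t_next               # weight on h*grad(K_{j-1})
        k_prev, k_cur = k_cur, nu * k_cur + kappa * k_prev - mu * h * grad(k_cur)
        t_prev, t_cur = t_cur, t_next
    return k_cur

# Hypothetical quadratic test problem f(x) = 0.5 * x.T @ A @ x.
rng = np.random.default_rng(0)
dim, L = 100, 1.0
Q, _ = np.linalg.qr(rng.standard_normal((dim, dim)))
A = Q @ np.diag(np.linspace(1e-3, L, dim)) @ Q.T
grad = lambda x: A @ x

s, eta = 10, 0.05
w0 = 1.0 + eta / s**2
ts, dts = chebyshev_and_derivative(s, w0)
h = (1.0 + w0) * dts / (ts * L)      # largest h keeping h*L in the stability interval
x = rng.standard_normal(dim)
for _ in range(50):
    x = stabilised_gradient_step(grad, x, h, s, eta)
print(np.linalg.norm(x))             # distance to the minimiser x* = 0 shrinks
```

On a quadratic, this recurrence damps each Hessian eigenmode by the Chebyshev stability polynomial T_s(w0 + w1*h*lambda)/T_s(w0), which is why step sizes roughly s**2 times the plain gradient descent limit remain stable; the paper analyses how such parameters can be chosen so that the resulting rate nearly matches that of conjugate gradient.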

Citation (ISO format)
EFTEKHARI, Armin et al. Explicit stabilised gradient descent for faster strongly convex optimisation. In: BIT Numerical Mathematics, 2021, vol. 61, n° 1, p. 119–139. doi: 10.1007/s10543-020-00819-y
Main files (1)
Article (Published version)
Identifiers
Additional URL for this publication http://link.springer.com/10.1007/s10543-020-00819-y
Journal ISSN 0006-3835
213 views
168 downloads

Technical information

Creation 20/04/2021 21:01:00
First validation 20/04/2021 21:01:00
Update time 16/03/2023 00:26:11
Status update 16/03/2023 00:26:10
Last indexation 31/10/2024 21:53:50
All rights reserved by Archive ouverte UNIGE and the University of Geneva