Scientific article
Open access
English

Explicit stabilised gradient descent for faster strongly convex optimisation

Published in: BIT Numerical Mathematics, vol. 61, no. 1, p. 119–139
Publication date: 2021
Abstract

We introduce the explicit stabilised gradient descent method (ESGD) for strongly convex optimisation problems. This new algorithm is based on explicit stabilised integrators for stiff differential equations, a powerful class of numerical schemes that avoid the severe step size restriction faced by standard explicit integrators. For optimising quadratic and strongly convex functions, we prove that ESGD nearly achieves the optimal convergence rate of the conjugate gradient algorithm, and that the suboptimality of ESGD diminishes as the condition number of the quadratic function worsens. We show that this optimal rate is also obtained for a partitioned variant of ESGD applied to perturbations of quadratic functions. In addition, numerical experiments on general strongly convex problems show that ESGD outperforms Nesterov's accelerated gradient descent.
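
For intuition only, the following minimal Python/NumPy sketch illustrates the idea underlying the method described above: an s-stage damped Chebyshev (explicit stabilised) integrator applied to the gradient flow x'(t) = -grad f(x) of a quadratic f(x) = 0.5 x^T A x, with the damping tied to the condition number. All names (cheb_params, stabilised_step), the test matrix, and the parameter choices for s, eta and h are illustrative assumptions for this sketch, not the paper's exact ESGD scheme.

import numpy as np


def cheb_params(s, eta):
    # Chebyshev values T_j(w0) and derivatives T_j'(w0) for the damped method,
    # computed with the standard three-term recurrences; returns w0, w1, T.
    w0 = 1.0 + eta / s**2
    T, dT = np.empty(s + 1), np.empty(s + 1)
    T[0], T[1] = 1.0, w0
    dT[0], dT[1] = 0.0, 1.0
    for j in range(2, s + 1):
        T[j] = 2.0 * w0 * T[j - 1] - T[j - 2]
        dT[j] = 2.0 * T[j - 1] + 2.0 * w0 * dT[j - 1] - dT[j - 2]
    return w0, T[s] / dT[s], T


def stabilised_step(grad, x, h, s, w0, w1, T):
    # One s-stage damped Chebyshev step of size h for the gradient flow x' = -grad f(x).
    k_prev, k_curr = x, x - h * (w1 / w0) * grad(x)
    for j in range(2, s + 1):
        mu = 2.0 * w1 * T[j - 1] / T[j]
        nu = 2.0 * w0 * T[j - 1] / T[j]
        kap = -T[j - 2] / T[j]
        k_prev, k_curr = k_curr, nu * k_curr + kap * k_prev - h * mu * grad(k_curr)
    return k_curr


# Illustrative test problem: f(x) = 0.5 * x^T A x with A symmetric positive definite,
# eigenvalues between m and L, so the minimiser is x* = 0.
rng = np.random.default_rng(0)
n, m, L = 50, 1e-2, 1.0
Q, _ = np.linalg.qr(rng.standard_normal((n, n)))
A = Q @ np.diag(np.linspace(m, L, n)) @ Q.T


def grad(x):
    return A @ x


s, steps = 10, 20
eta = 2.0 * s**2 / (L / m - 1.0)    # damping chosen from the condition number (illustrative)
w0, w1, T = cheb_params(s, eta)
h = (w0 + 1.0) / (w1 * L)           # step size placing the whole spectrum in the damped region

x_stab = np.ones(n)
for _ in range(steps):
    x_stab = stabilised_step(grad, x_stab, h, s, w0, w1, T)

x_gd = np.ones(n)                   # plain gradient descent with the same budget of
for _ in range(s * steps):          # s * steps gradient evaluations, step size 1/L
    x_gd = x_gd - (1.0 / L) * grad(x_gd)

print("stabilised iterate norm:", np.linalg.norm(x_stab))
print("plain GD iterate norm:  ", np.linalg.norm(x_gd))

Each stabilised step above spends s gradient evaluations but can take an effective step roughly s^2 times longer than a standard explicit scheme allows, which is where the speed-up over plain gradient descent comes from; with these illustrative parameters the stabilised iterate should reach a much smaller residual than plain gradient descent for the same gradient budget.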

Citation (ISO format)
EFTEKHARI, Armin et al. Explicit stabilised gradient descent for faster strongly convex optimisation. In: BIT Numerical Mathematics, 2021, vol. 61, n° 1, p. 119–139. doi: 10.1007/s10543-020-00819-y
Main files (1)
Article (Published version)
Identifiers
ISSN of the journal: 0006-3835
144 views
113 downloads

Technical information

Creation: 04/20/2021 9:01:00 PM
First validation: 04/20/2021 9:01:00 PM
Update time: 03/16/2023 12:26:11 AM
Status update: 03/16/2023 12:26:10 AM
Last indexation: 02/12/2024 12:04:29 PM
All rights reserved by Archive ouverte UNIGE and the University of Geneva