accelerated_proximal_gradient

odl.solvers.nonsmooth.proximal_gradient_solvers.accelerated_proximal_gradient(x, f, g, gamma, niter, callback=None, **kwargs)[source]

Accelerated proximal gradient algorithm for convex optimization.

The method is known as the “Fast Iterative Shrinkage-Thresholding Algorithm” (FISTA). See [Beck2009] for more information.

Solves the convex optimization problem:

min_{x in X} f(x) + g(x)

where the proximal operator of f is known and g is differentiable.

Parameters
x : f.domain element

Starting point of the iteration, updated in-place.

f : Functional

The function f in the problem definition. Needs to have f.proximal.

g : Functional

The function g in the problem definition. Needs to have g.gradient.

gamma : positive float

Step size parameter.

niter : non-negative int, optional

Number of iterations.

callback : callable, optional

Function called with the current iterate after each iteration.
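
Examples

A minimal usage sketch (illustrative, not from the ODL documentation), solving the small LASSO-type problem min_x ||x - b||_2^2 + 0.1 ||x||_1 on rn(5). Here \nabla g(x) = 2(x - b) is 2-Lipschitz, so \beta = 1/2 and the step size gamma = 0.4 satisfies the condition in the Notes below:

>>> import odl
>>> space = odl.rn(5)
>>> b = space.element([1.0, -2.0, 0.5, 0.0, 3.0])
>>> # Smooth part g(x) = ||x - b||_2^2, which provides g.gradient
>>> g = odl.solvers.L2NormSquared(space).translated(b)
>>> # Nonsmooth part f(x) = 0.1 * ||x||_1, which provides f.proximal
>>> f = 0.1 * odl.solvers.L1Norm(space)
>>> x = space.zero()
>>> odl.solvers.accelerated_proximal_gradient(x, f, g, gamma=0.4, niter=100)

The solver updates x in place; for this problem the minimizer is known in closed form (b soft-thresholded at level 0.05), which makes the example easy to verify.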

Notes

The problem of interest is

\min_{x \in X} f(x) + g(x),

where the formal conditions are that f : X \to \mathbb{R} is proper, convex and lower-semicontinuous, and g : X \to \mathbb{R} is differentiable and \nabla g is 1 / \beta-Lipschitz continuous.

Convergence is only guaranteed if the step length \gamma satisfies

0 < \gamma < 2 \beta.
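
For reference, the update scheme analyzed in [Beck2009] (a sketch of the standard FISTA iteration, with x_0 = y_1 given and t_1 = 1; the ODL implementation may differ in bookkeeping details) reads

x_k = \mathrm{prox}_{\gamma f}\left( y_k - \gamma \nabla g(y_k) \right),

t_{k+1} = \frac{1 + \sqrt{1 + 4 t_k^2}}{2},

y_{k+1} = x_k + \frac{t_k - 1}{t_{k+1}} (x_k - x_{k-1}).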

References

[Beck2009] Beck, A and Teboulle, M. A fast iterative shrinkage-thresholding algorithm for linear inverse problems. SIAM Journal on Imaging Sciences, 2(1) (2009), pp 183--202.