accelerated_proximal_gradient

odl.solvers.nonsmooth.proximal_gradient_solvers.accelerated_proximal_gradient(x, f, g, gamma, niter, callback=None, **kwargs)

Accelerated proximal gradient algorithm for convex optimization.
The method is known as the "Fast Iterative Shrinkage-Thresholding Algorithm" (FISTA). See [Beck2009] for more information.
Solves the convex optimization problem

    min_{x in X} f(x) + g(x),

where the proximal operator of f is known and g is differentiable. A usage sketch follows the parameter list below.

Parameters
- x : f.domain element
  Starting point of the iteration, updated in-place.
- f : Functional
  The function f in the problem definition. Needs to have f.proximal.
- g : Functional
  The function g in the problem definition. Needs to have g.gradient.
- gamma : positive float
  Step size parameter.
- niter : non-negative int, optional
  Number of iterations.
- callback : callable, optional
  Function called with the current iterate after each iteration.
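A minimal usage sketch, assuming the standard ODL functionals odl.solvers.L1Norm and odl.solvers.L2NormSquared (with its translated method) and the odl.solvers.CallbackPrintIteration helper; the data values below are illustrative, not from the ODL documentation:

```python
import odl

# Small LASSO-type problem: min_x ||x||_1 + ||x - b||_2^2 on R^5.
space = odl.rn(5)
b = space.element([1.0, -2.0, 0.1, 0.0, 3.0])

f = odl.solvers.L1Norm(space)                       # provides f.proximal
g = odl.solvers.L2NormSquared(space).translated(b)  # provides g.gradient

# grad g(x) = 2*(x - b) is 2-Lipschitz, i.e. beta = 1/2, so any step
# size gamma < 2*beta = 1 is admissible (see Notes below).
x = space.zero()  # starting point, updated in-place
odl.solvers.accelerated_proximal_gradient(
    x, f=f, g=g, gamma=0.5, niter=100,
    callback=odl.solvers.CallbackPrintIteration())

print(x)  # approximately the soft-thresholding of b at level 1/2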
Notes
The problem of interest is

    \min_{x \in X} f(x) + g(x),

where the formal conditions are that f : X \to \mathbb{R} is proper, convex and lower-semicontinuous, and g : X \to \mathbb{R} is differentiable with \nabla g being 1/\beta-Lipschitz continuous.
Convergence is only guaranteed if the step length \gamma satisfies

    0 < \gamma < 2 \beta.
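For orientation, the classical FISTA recursion from [Beck2009] reads as follows (a sketch of the standard scheme with y^1 = x^0 and t_1 = 1; ODL's internal variable names and bookkeeping may differ in detail):

```latex
\begin{aligned}
  x^{k}   &= \operatorname{prox}_{\gamma f}\bigl(y^{k} - \gamma \nabla g(y^{k})\bigr),\\
  t_{k+1} &= \frac{1 + \sqrt{1 + 4 t_k^2}}{2},\\
  y^{k+1} &= x^{k} + \frac{t_k - 1}{t_{k+1}} \bigl(x^{k} - x^{k-1}\bigr).
\end{aligned}
```

The first line is an ordinary proximal gradient step; the momentum term in the third line is what accelerates the worst-case convergence rate from O(1/k) to O(1/k^2).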
References
[Beck2009] Beck, A and Teboulle, M. A Fast Iterative Shrinkage-Thresholding Algorithm for Linear Inverse Problems. SIAM Journal on Imaging Sciences, 2 (2009), pp 183-202.