proximal_gradient

odl.solvers.nonsmooth.proximal_gradient_solvers.proximal_gradient(x, f, g, gamma, niter, callback=None, **kwargs)[source]

(Accelerated) proximal gradient algorithm for convex optimization.

Also known as the "Iterative Soft-Thresholding Algorithm" (ISTA). See [Beck2009] for more information.

This solver solves the convex optimization problem:

min_{x in X} f(x) + g(x)

where the proximal operator of f is known and g is differentiable.
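For instance (an illustration not taken from the original text), a LASSO-type problem

min_{x in X} ||Ax - b||_2^2 + lambda ||x||_1

fits this splitting with f(x) = lambda ||x||_1, whose proximal operator is componentwise soft-thresholding, and g(x) = ||Ax - b||_2^2, which is differentiable with gradient 2 A^*(Ax - b).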

Parameters:
x : f.domain element

Starting point of the iteration, updated in-place.

f : Functional

The function f in the problem definition. Needs to have f.proximal.

g : Functional

The function g in the problem definition. Needs to have g.gradient.

gamma : positive float

Step size parameter.

niter : non-negative int, optional

Number of iterations.

callback : callable, optional

Function called with the current iterate after each iteration.

Other Parameters:
lam : float or callable, optional

Overrelaxation step size. If callable, it should take an index (starting at zero) and return the corresponding step size. Default: 1.0
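
For instance (a minimal sketch, not from the original documentation, assuming odl.rn, odl.solvers.L1Norm and odl.solvers.L2NormSquared from the standard ODL API), lam can be passed as a callable defining a per-iteration schedule:

import odl

space = odl.rn(10)
f = odl.solvers.L1Norm(space)         # nonsmooth part, proximal is known
g = odl.solvers.L2NormSquared(space)  # smooth part, gradient 2*x is 2-Lipschitz
x = space.zero()

# Hypothetical decaying schedule lam_k = 1 / (k + 1), evaluated at k = 0, 1, 2, ...
odl.solvers.proximal_gradient(x, f, g, gamma=0.5, niter=20,
                              lam=lambda k: 1.0 / (k + 1))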

Notes

The problem of interest is

\min_{x \in X} f(x) + g(x),

where the formal conditions are that f : X \to \mathbb{R} is proper, convex and lower-semicontinuous, and g : X \to \mathbb{R} is differentiable and \nabla g is 1 / \beta-Lipschitz continuous.

Convergence is only guaranteed if the step length \gamma satisfies

0 < \gamma < 2 \beta

and the parameter \lambda (lam) satisfies

\sum_{k=0}^\infty \lambda_k (\delta - \lambda_k) = + \infty

where \delta = \min \{1, \beta / \gamma\}.
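
A minimal end-to-end usage sketch (not part of the original documentation, assuming the standard ODL API: odl.rn, odl.solvers.L1Norm, odl.solvers.L2NormSquared and Functional.translated), solving a small hypothetical denoising problem min_x ||x - b||_2^2 + 0.1 ||x||_1:

import numpy as np
import odl

space = odl.rn(100)
b = space.element(np.sin(np.linspace(0, 6, 100)))  # hypothetical data

# f has a known proximal operator (soft-thresholding), g is differentiable.
f = 0.1 * odl.solvers.L1Norm(space)
g = odl.solvers.L2NormSquared(space).translated(b)

# grad g(x) = 2 (x - b) is 2-Lipschitz, i.e. 1/beta-Lipschitz with beta = 0.5,
# so the condition above requires 0 < gamma < 2 * beta = 1.
x = space.zero()
odl.solvers.proximal_gradient(x, f, g, gamma=0.5, niter=100)

After the call, x holds the in-place updated approximate minimizer.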

References

[Beck2009] Beck, A. and Teboulle, M. A fast iterative shrinkage-thresholding algorithm for linear inverse problems. SIAM Journal on Imaging Sciences, 2(1):183-202, 2009.