forward_backward_pd

odl.solvers.nonsmooth.forward_backward.forward_backward_pd(x, f, g, L, h, tau, sigma, niter, callback=None, **kwargs)

The forward-backward primal-dual splitting algorithm.

The algorithm minimizes the sum of several convex functionals composed with linear operators:

min_x f(x) + sum_i g_i(L_i x) + h(x)

where f, g_i are convex functionals, L_i are linear operators, and h is a convex and differentiable functional.

The method can also be used to solve the more general problem:

min_x f(x) + sum_i (g_i @ l_i)(L_i x) + h(x)

where l_i are strongly convex functionals and @ is the infimal convolution:

(g @ l)(x) = inf_y { g(y) + l(x-y) }

Note that the strong convexity of l_i makes the convex conjugates l_i^* differentiable; see the Notes section for more information on this.
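A classical instance of this smoothing (a standard fact, not specific to this solver) is l = \frac{1}{2}\|\cdot\|^2, for which the infimal convolution becomes the Moreau envelope of g,

(g \Box \tfrac{1}{2}\|\cdot\|^2)(x) = \inf_y \{ g(y) + \tfrac{1}{2}\|x - y\|^2 \},

which is differentiable even when g itself is not.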

Parameters:
x : LinearSpaceElement

Initial point, updated in-place.

f : Functional

The functional f. Needs to have f.proximal.

g : sequence of Functionals

The functionals g_i. Each needs to have g_i.convex_conj.proximal.

L : sequence of Operators

Sequence of linear operators L_i, with as many elements as g.

h : Functional

The functional h. Needs to have h.gradient.

tau : float

Step size-like parameter for f.

sigma : sequence of floats

Sequence of step size-like parameters for the sequence g.

niter : int

Number of iterations.

callback : callable, optional

Function called with the current iterate after each iteration.

Other Parameters:
l : sequence of Functionals, optional

The functionals l_i. Each needs to have l_i.convex_conj.gradient. If omitted, the simpler problem without l_i will be considered.
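A minimal usage sketch (assuming a standard ODL installation; the space, functionals, operators and step sizes below are illustrative choices, not prescribed by this solver):

import odl

space = odl.uniform_discr(0, 1, 100)

f = odl.solvers.IndicatorNonnegativity(space)  # has f.proximal
g = [odl.solvers.L1Norm(space)]                # each g_i has convex_conj.proximal
L = [odl.IdentityOperator(space)]              # one linear operator per g_i
h = odl.solvers.L2NormSquared(space)           # differentiable, has h.gradient

x = space.zero()  # initial point, updated in-place
odl.solvers.forward_backward_pd(
    x, f, g, L, h, tau=0.5, sigma=[0.5], niter=100)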

See also

odl.solvers.nonsmooth.primal_dual_hybrid_gradient.pdhg

Solver for similar problems without differentiability in any of the terms.

odl.solvers.nonsmooth.douglas_rachford.douglas_rachford_pd

Solver for similar problems without differentiability in any of the terms.

Notes

The mathematical problem to solve is

\min_x f(x) + \sum_{i=1}^n (g_i \Box l_i)(L_i x) + h(x),

where f, g_i, l_i and h are functionals and L_i are linear operators. The infimal convolution g \Box l is defined by

(g \Box l)(x) = \inf_y g(y) + l(x - y).

The exact conditions on the involved functionals are as follows: f and g_i are proper, convex and lower semicontinuous, and h is convex and differentiable with \eta^{-1}-Lipschitz continuous gradient, \eta > 0.

The optional operators \nabla l_i^* need to be \nu_i-Lipschitz continuous. Note that in the paper, the condition is formulated as l_i being proper, lower semicontinuous, and \nu_i^{-1}-strongly convex, which implies that l_i^* have \nu_i-Lipschitz continuous gradients.
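As a standard illustration (not taken from the paper): the functional l(x) = \frac{1}{2\nu}\|x\|^2 is \nu^{-1}-strongly convex, its convex conjugate is

l^*(y) = \frac{\nu}{2}\|y\|^2, \qquad \nabla l^*(y) = \nu y,

and \nabla l^* is indeed \nu-Lipschitz continuous.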

If the optional operators \nabla l_i^* are omitted, the simpler problem without l_i will be considered. Mathematically, this is done by taking l_i to be the functionals that are zero only in the zero element and \infty otherwise. This gives that l_i^* are the zero functionals, and hence the corresponding gradients are the zero operators.
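Spelled out with the definitions above: for l_i = \iota_{\{0\}}, the indicator functional of the zero element,

(g_i \Box \iota_{\{0\}})(x) = g_i(x), \qquad l_i^*(y) = \sup_x \big( \langle x, y \rangle - \iota_{\{0\}}(x) \big) = 0,

so each infimal convolution reduces to g_i itself and \nabla l_i^* = 0.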

To guarantee convergence, the step sizes \tau and \sigma_i, together with the operator norms \|L_i\|, need to satisfy

2 \min \left\{ \frac{1}{\tau}, \frac{1}{\sigma_1}, \ldots, \frac{1}{\sigma_n} \right\} \cdot \min \{ \eta, \nu_1, \ldots, \nu_n \} \cdot \sqrt{1 - \tau \sum_{i=1}^n \sigma_i \|L_i\|^2} > 1,

where, for the simpler problem, all \nu_i can be taken to be \infty.
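The condition can be checked numerically with a small helper; the function below is a hypothetical sketch (not part of ODL), where opnorms holds estimates of \|L_i\|, e.g. obtained via odl.power_method_opnorm:

import math

def satisfies_step_size_condition(tau, sigmas, opnorms, eta, nus=None):
    """Check the convergence condition for given step sizes (sketch)."""
    if nus is None:
        # Simpler problem: all nu_i are taken to be infinite.
        nus = [math.inf] * len(sigmas)
    radicand = 1 - tau * sum(s * c ** 2 for s, c in zip(sigmas, opnorms))
    if radicand <= 0:
        return False
    lhs = (2 * min([1 / tau] + [1 / s for s in sigmas])
           * min([eta] + list(nus))
           * math.sqrt(radicand))
    return lhs > 1

For the illustrative values used above (tau=0.5, sigmas=[0.5], opnorms=[1.0], and eta=0.5, since the gradient of the squared L2 norm is 2-Lipschitz), this returns True.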

For reference on the forward-backward primal-dual algorithm, see [BC2015].

For more on proximal operators and algorithms see [PB2014].

References

[BC2015] Boț, R. I., and Csetnek, E. R. On the convergence rate of a forward-backward type primal-dual splitting algorithm for convex optimization problems. Optimization, 64.1 (2015), pp 5--23.

[PB2014] Parikh, N., and Boyd, S. Proximal Algorithms. Foundations and Trends in Optimization, 1 (2014), pp 127--239.