dca

odl.solvers.nonsmooth.difference_convex.dca(x, f, g, niter, callback=None)[source]

Subgradient DCA of Tao and An.

This algorithm solves a problem of the form

min_x f(x) - g(x),

where f and g are proper, convex and lower semicontinuous functions.

Parameters
x : LinearSpaceElement

Initial point, updated in-place.

f : Functional

Convex functional. Needs to implement f.convex_conj.gradient.

g : Functional

Convex functional. Needs to implement g.gradient.

niter : int

Number of iterations.

callback : callable, optional

Function called with the current iterate after each iteration.

See also

prox_dca

Solver with a proximal step for f and a subgradient step for g.

doubleprox_dc

Solver with proximal steps for all the nonsmooth convex functionals and a gradient step for a smooth functional.

Notes

The algorithm is described in Section 3, and in particular in Theorem 3, of [TA1997]. The problem

\min f(x) - g(x)

has the first-order optimality condition 0 \in \partial f(x) - \partial g(x), i.e., the method aims at finding an x such that there exists a common element

y \in \partial f(x) \cap \partial g(x).

The element y can be seen as a solution of the Toland dual problem

\min g^*(y) - f^*(y)

and the iteration is given by

y_n \in \partial g(x_n), \qquad x_{n+1} \in \partial f^*(y_n),

for n \geq 0. Here, a subgradient is obtained by evaluating the gradient of the respective functional, i.e., g.gradient for \partial g and f.convex_conj.gradient for \partial f^*.
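As a concrete illustration of the iteration above, the following plain-Python sketch applies it to the one-dimensional DC problem with f(x) = x^2 and g(x) = |x|, whose minimizers are x = \pm 1/2. It does not use ODL; all names are illustrative, not part of the ODL API.

```python
# Sketch of the DCA iteration y_n in dg(x_n), x_{n+1} = grad f*(y_n)
# for the toy problem f(x) = x^2, g(x) = |x| (minimizers at x = +/- 1/2).
# All names are hypothetical, not part of the ODL API.

def subgrad_g(x):
    # A subgradient of g(x) = |x|; we pick 1.0 at x = 0.
    return 1.0 if x >= 0 else -1.0

def grad_f_conj(y):
    # Gradient of the convex conjugate f*(y) = y^2 / 4 of f(x) = x^2.
    return y / 2.0

def dca_sketch(x, niter):
    for _ in range(niter):
        y = subgrad_g(x)    # y_n in ∂g(x_n)
        x = grad_f_conj(y)  # x_{n+1} = ∇f*(y_n)
    return x

print(dca_sketch(1.0, 10))  # converges to the minimizer 0.5
```

Starting from x_0 = 1, the first step already reaches the fixed point x = 1/2, where the objective x^2 - |x| attains its minimum value -1/4.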

References

[TA1997] Tao, P. D., and An, L. T. H. Convex analysis approach to d.c. programming: theory, algorithms and applications. Acta Mathematica Vietnamica, 22.1 (1997), pp 289--355.