proximal_convex_conj_kl_cross_entropy

odl.solvers.nonsmooth.proximal_operators.proximal_convex_conj_kl_cross_entropy(space, lam=1, g=None)[source]

Proximal factory of the convex conj of cross entropy KL divergence.

Function returning the proximal factory of the convex conjugate of the functional F, where F is the cross entropy Kullback-Leibler (KL) divergence given by:

F(x) = sum_i (x_i ln(pos(x_i)) - x_i ln(g_i) + g_i - x_i) + ind_P(x)

with x and g in the linear space X, and g non-negative. Here, pos denotes the nonnegative part, and ind_P is the indicator function for nonnegativity.
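As a concrete illustration, the functional can be evaluated with plain NumPy. This is a minimal sketch, not the ODL implementation; the helper name `kl_cross_entropy` is ours:

```python
import numpy as np

def kl_cross_entropy(x, g):
    """Evaluate F(x) = sum_i (x_i ln(pos(x_i)) - x_i ln(g_i) + g_i - x_i),
    returning infinity when any x_i is negative (the indicator ind_P)."""
    x = np.asarray(x, dtype=float)
    g = np.asarray(g, dtype=float)
    if np.any(x < 0):
        return np.inf  # outside the domain enforced by ind_P
    # Convention 0 * ln(0) = 0, realized via the nonnegative part pos(x)
    xlogx = np.where(x > 0, x * np.log(np.maximum(x, 1e-300)), 0.0)
    return float(np.sum(xlogx - x * np.log(g) + g - x))
```

Note that `F(g) = 0` and `F(x) > 0` for feasible `x != g`, as expected for a divergence.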

Parameters:
space : TensorSpace

Space X which is the domain of the functional F.

lam : positive float, optional

Scaling factor.

g : space element, optional

Data term, positive. If None, it is taken as the one-element.

Returns:
prox_factory : function

Factory for the proximal operator to be initialized.

See also

proximal_convex_conj_kl

Proximal factory for the related KL divergence functional.

Notes

The functional is given by the expression

F(x) = \sum_i (x_i \ln(pos(x_i)) - x_i \ln(g_i) + g_i - x_i) +
I_{x \geq 0}(x)

The indicator function I_{x \geq 0}(x) is used to restrict the domain of F so that F is defined over the whole space X. The nonnegativity thresholding pos makes F real-valued.

Note that the functional is not well-defined without a prior g. Hence, if g is omitted, it is interpreted as being equal to the one-element.

The convex conjugate F^* of F is

F^*(p) = \sum_i g_i (exp(p_i) - 1)

where p is the variable dual to x.

The proximal operator of the convex conjugate of F is

\mathrm{prox}_{\sigma (\lambda F)^*}(x) = x - \lambda
W(\frac{\sigma}{\lambda} g e^{x/\lambda})

where \sigma is the step-size-like parameter, \lambda is the weighting in front of the functional F, and W is the Lambert W function (see, for example, the Wikipedia article).
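This formula can be checked numerically with SciPy's Lambert W function. The sketch below uses our own function name, not the ODL code: the prox output y must satisfy the optimality condition y + \sigma g e^{y/\lambda} = x, which follows from \nabla (\lambda F)^*(p) = g e^{p/\lambda}:

```python
import numpy as np
from scipy.special import lambertw

def prox_cc_kl_cross_entropy(x, sigma, lam, g):
    """prox_{sigma (lam F)^*}(x) = x - lam * W((sigma/lam) * g * exp(x/lam)),
    using the principal branch of the Lambert W function (argument > 0)."""
    u = (sigma / lam) * g * np.exp(x / lam)
    return x - lam * lambertw(u).real

x = np.array([-1.0, 0.5, 2.0])
g = np.array([1.0, 2.0, 0.5])
sigma, lam = 0.7, 1.5

y = prox_cc_kl_cross_entropy(x, sigma, lam, g)

# Optimality condition of the prox: y + sigma * g * exp(y/lam) == x
assert np.allclose(y + sigma * g * np.exp(y / lam), x)
```

Since `lambertw` returns complex values, the `.real` part is taken; on the principal branch with a positive argument the imaginary part is zero.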

For real-valued input x, the Lambert W function is defined only for arguments x \geq -1/e, and it has two branches for -1/e \leq x < 0. However, for the intended use cases, where \lambda and g are positive, the argument of W is always positive, so the principal branch applies.

For further information about the functional, see the Wikipedia article on Kullback-Leibler divergence.

The KL cross entropy functional F described above is related to another functional, also known as the KL divergence. That functional is often used as a data discrepancy term in inverse problems when the data is corrupted by Poisson noise, and it is obtained by interchanging the roles of the prior g and the variable x. See the See Also section.