# chainer.kl_divergence

chainer.kl_divergence(dist1, dist2)

Computes Kullback-Leibler divergence.

For two continuous distributions $p(x)$ and $q(x)$, it is expressed as

$$D_{KL}(p \| q) = \int p(x) \log \frac{p(x)}{q(x)} \, dx$$

For two discrete distributions $p(x)$ and $q(x)$, it is expressed as

$$D_{KL}(p \| q) = \sum_x p(x) \log \frac{p(x)}{q(x)}$$
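
As a quick sanity check of the discrete form, the sum can be evaluated directly with NumPy. This is a standalone illustration of the formula, not part of the Chainer API; the two distributions here are arbitrary:

```python
import numpy as np

# Two discrete distributions over the same three-point support.
p = np.array([0.5, 0.3, 0.2])
q = np.array([0.4, 0.4, 0.2])

# D_KL(p || q) = sum_x p(x) * log(p(x) / q(x))
kl = np.sum(p * np.log(p / q))
print(kl)  # ~0.0253
```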

Parameters

• dist1 (Distribution) – Distribution $p$, the first (left) operand of the KL divergence.

• dist2 (Distribution) – Distribution $q$, the second (right) operand of the KL divergence.

Returns

Output variable representing the KL divergence $D_{KL}(p \| q)$.

Return type

Variable
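
A minimal usage sketch with the built-in Normal distribution from chainer.distributions; the parameter values here are arbitrary:

```python
import numpy as np
import chainer
from chainer import distributions

# Batch of three univariate normal distributions.
p = distributions.Normal(
    loc=np.zeros(3, dtype=np.float32),
    scale=np.ones(3, dtype=np.float32))
q = distributions.Normal(
    loc=np.ones(3, dtype=np.float32),
    scale=2 * np.ones(3, dtype=np.float32))

# Elementwise KL divergence over the batch; returns a Variable of shape (3,).
kl = chainer.kl_divergence(p, q)
print(kl.array)  # closed form gives log 2 + 2/8 - 1/2, i.e. ~0.4431 per element
```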

Using register_kl(), we can define the behavior of kl_divergence() for any pair of distribution types.
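
The following is a minimal sketch of this mechanism: it registers a closed-form KL divergence for a toy exponential distribution. The MyExponential class and the helper _kl_myexponential_myexponential are illustrative stand-ins written for this example, not part of chainer.distributions:

```python
import numpy as np
import chainer
import chainer.functions as F


class MyExponential(chainer.Distribution):
    """Toy exponential distribution with rate parameter `lam` (illustrative only)."""

    def __init__(self, lam):
        super(MyExponential, self).__init__()
        self.lam = chainer.as_variable(lam)


# Closed form: KL(Exp(a) || Exp(b)) = log(a / b) + b / a - 1
@chainer.register_kl(MyExponential, MyExponential)
def _kl_myexponential_myexponential(dist1, dist2):
    return (F.log(dist1.lam) - F.log(dist2.lam)
            + dist2.lam / dist1.lam - 1.)


# kl_divergence() now dispatches on the pair of types (MyExponential, MyExponential).
p = MyExponential(np.array([1.], dtype=np.float32))
q = MyExponential(np.array([2.], dtype=np.float32))
print(chainer.kl_divergence(p, q).array)  # ~[0.3069]
```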