# chainer.cross_entropy

chainer.cross_entropy(dist1, dist2)[source]

Computes the cross entropy between two distributions.

For two continuous distributions $$p(x), q(x)$$, it is expressed as

$H(p,q) = - \int p(x) \log q(x) dx$

For two discrete distributions $$p(x), q(x)$$, it is expressed as

$H(p,q) = - \sum_x p(x) \log q(x)$
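The discrete formula can be evaluated directly with NumPy; the two distributions below are illustrative values, not part of the Chainer API:

```python
import numpy as np

# Two discrete distributions over the same support (illustrative values).
p = np.array([0.5, 0.5])
q = np.array([0.25, 0.75])

# H(p, q) = -sum_x p(x) * log q(x)
cross_entropy = -np.sum(p * np.log(q))
```

Note that cross entropy is not symmetric: swapping `p` and `q` generally gives a different value.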

This function calls kl_divergence() and the entropy() method of dist1, using the identity $H(p,q) = H(p) + D_{KL}(p \| q)$. Therefore, it is necessary to register a KL divergence function with the register_kl() decorator and to define entropy() in dist1.

Parameters
• dist1 (Distribution) – Distribution representing $$p$$, the first (left) operand of the cross entropy.

• dist2 (Distribution) – Distribution representing $$q$$, the second (right) operand of the cross entropy.

Returns

Output variable representing cross entropy $$H(p,q)$$.

Return type

Variable