chainer.gradient_check.check_double_backward

chainer.gradient_check.check_double_backward(func, x_data, y_grad, x_grad_grad, params=(), params_grad_grad=(), eps=0.001, atol=0.0001, rtol=0.001, no_grads=None, dtype=None, detect_nondifferentiable=False)[source]

Test second-order differentiation of a given procedure.

This function automatically checks whether the backward procedure of func is correctly implemented for further differentiation. It first computes the gradient of func w.r.t. its inputs in the same way as check_backward(). It then invokes the backward procedure again on the resulting gradient variables, starting from the initial gradient given by x_grad_grad, and also computes the second gradient using numerical_grad(). The two results are compared to check whether the second-order gradients are approximately correct.
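For example, a second-order gradient check of an elementwise function can be written as the following sketch; the shapes, dtype, and the choice of F.sin are illustrative assumptions rather than part of the API:

>>> import numpy as np
>>> import chainer.functions as F
>>> from chainer import gradient_check
>>>
>>> def f(x):
...     # sin has a nonzero second derivative, so the check
...     # cannot pass trivially.
...     return F.sin(x)
...
>>> x = np.random.uniform(-1, 1, (3, 4)).astype(np.float32)
>>> gy = np.random.uniform(-1, 1, (3, 4)).astype(np.float32)   # y_grad
>>> ggx = np.random.uniform(-1, 1, (3, 4)).astype(np.float32)  # x_grad_grad
>>> gradient_check.check_double_backward(f, x, gy, ggx, atol=1e-3, rtol=1e-3)

The call raises an error if the backpropagated second-order gradients deviate from the numerically computed ones beyond the given tolerances; otherwise it finishes silently.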

Note that this function does not check whether the first-order differentiation is correct; the numerical gradient assumes that the first-order gradient given by the usual chainer.Variable.backward() is correct. The implementation of each differentiable function should therefore be tested with check_backward() first, and then with this function if necessary.

For the details of the arguments, see check_backward(). The additional arguments x_grad_grad and params_grad_grad are (tuples of) Variable(s) that hold the initial gradients corresponding to the first-order gradients of each input and parameter. Note that the default error tolerances atol and rtol are slightly larger than those of check_backward(), because the numerical gradients of second-order differentiation are less accurate than those of first-order gradients.
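To illustrate how params and params_grad_grad line up with the inputs, the following sketch checks a parameterized procedure; the use of L.Linear and the chosen shapes are assumptions made for this example:

>>> import numpy as np
>>> import chainer.links as L
>>> from chainer import gradient_check
>>>
>>> link = L.Linear(4, 3)  # holds parameters W of shape (3, 4) and b of shape (3,)
>>> x = np.random.uniform(-1, 1, (2, 4)).astype(np.float32)
>>> gy = np.random.uniform(-1, 1, (2, 3)).astype(np.float32)
>>> # One initial gradient per input and per parameter, each matching
>>> # the shape of the corresponding first-order gradient.
>>> ggx = np.random.uniform(-1, 1, x.shape).astype(np.float32)
>>> ggW = np.random.uniform(-1, 1, link.W.shape).astype(np.float32)
>>> ggb = np.random.uniform(-1, 1, link.b.shape).astype(np.float32)
>>> gradient_check.check_double_backward(
...     link, x, gy, ggx,
...     params=(link.W, link.b),
...     params_grad_grad=(ggW, ggb),
...     atol=1e-3, rtol=1e-3)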