Make a context manager which disables back-propagation.

In this context, Chainer does not build a computational graph. A Variable created in this context holds no reference to the Function that created it, so you cannot compute gradients with backward(). In exchange, memory consumption is reduced, because the intermediate results needed for back-propagation are not retained.

In the following example, y is created inside the context, so you cannot call backward() on it.

>>> import numpy as np
>>> import chainer
>>> x = chainer.Variable(np.array([1], np.float32))
>>> with chainer.no_backprop_mode():
...     y = x + 1
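The behavior described above can be sketched with a small toy re-implementation. This is a hypothetical illustration of the idea (a flag checked when a variable is created), not Chainer's actual code; the names `backprop_enabled`, `no_backprop_mode`, and the toy `Variable` class are all assumptions made for this sketch.

```python
import contextlib
import threading

# Thread-local flag that graph-building code consults.
_state = threading.local()

def backprop_enabled():
    """Return True unless we are inside no_backprop_mode()."""
    return getattr(_state, "enabled", True)

@contextlib.contextmanager
def no_backprop_mode():
    """Disable graph construction inside the `with` block."""
    prev = backprop_enabled()
    _state.enabled = False
    try:
        yield
    finally:
        _state.enabled = prev  # restore the previous state on exit

class Variable:
    """Toy variable: records its creator only when backprop is enabled."""
    def __init__(self, data, creator=None):
        self.data = data
        # Inside no_backprop_mode() the creator reference is dropped,
        # so no graph (and none of its memory) is retained.
        self.creator = creator if backprop_enabled() else None

    def __add__(self, other):
        return Variable(self.data + other, creator=self)

x = Variable(1.0)
y1 = x + 1              # graph recorded: y1.creator is x
with no_backprop_mode():
    y2 = x + 1          # no graph: y2.creator is None
```

Since `y2` has no creator, a backward pass has nothing to walk, which is exactly why backward() is unusable on variables created in this mode.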