Make a context manager which disables back-propagation.
In this context, Chainer does not make a computational graph.
Variable created in this context does not have a reference to the
Function which created the variable, so you cannot compute gradients with
backward(). Instead, memory consumption is reduced.
In this example, y is created in this context, so you cannot call
backward() on it.
>>> import numpy as np
>>> import chainer
>>> x = chainer.Variable(np.array([1,], 'f'))
>>> with chainer.no_backprop_mode():
...     y = x + 1
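Continuing the session above, a quick sketch of how to observe this (the creator check is an illustration, not part of the original example): y, made inside the context, keeps no reference to the Function that created it, while a variable made outside the context does.

>>> y.creator is None  # no graph was recorded inside the context
True
>>> z = x + 1  # created outside no_backprop_mode, so the graph is recorded
>>> z.creator is None
False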