chainer.no_backprop_mode
chainer.no_backprop_mode()
Make a context manager which disables back-propagation.
In this context, Chainer does not build a computational graph. A Variable created in this context holds no reference to the Function that created it, so you cannot compute gradients with backward(); in exchange, memory consumption is reduced.

In this example, y is created inside the context, so you cannot call backward() on it:

>>> x = chainer.Variable(numpy.array([1,], 'f'))
>>> with chainer.no_backprop_mode():
...     y = x + 1