Call a function without storing internal results.
On forward propagation, Chainer stores all internal results of a ``Function`` on the computational graph, because they are required for backward propagation. These results consume too much memory when they are large. This method forgets such internal results on forward propagation, and still supports backpropagation by recalculation.
In forward propagation, this method calls the given function with the given variables without creating a computational graph; that means no internal results are stored. In backward propagation, this method calls the given function again to create a computational graph and executes backpropagation on it.
This method reduces internal memory usage. In exchange, it requires more computation time, as it calls the function twice.
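This trade-off can be sketched without Chainer: run the function once on forward while keeping only its output, then run it again on backward to obtain gradients. The helper names (``forget_forward``, ``forget_backward``) are hypothetical, and numerical differentiation stands in for Chainer's graph-based backprop; the point is only that the function runs a second time instead of storing internal results:

```python
import numpy as np

def f(a, b):
    # Toy function whose internal result (a + b) would normally be stored.
    return a + b + a

def forget_forward(func, *xs):
    # Forward pass: keep only the output, no graph, no internal results.
    return func(*xs)

def forget_backward(func, xs, grad_out):
    # Backward pass: re-run func to get gradients w.r.t. each input.
    # Central differences stand in for graph-based backprop here.
    eps = 1e-4
    grads = []
    for x in xs:
        g = np.zeros_like(x)
        it = np.nditer(x, flags=['multi_index'])
        while not it.finished:
            idx = it.multi_index
            orig = x[idx]
            x[idx] = orig + eps
            y_plus = func(*xs)   # recomputation, not a stored result
            x[idx] = orig - eps
            y_minus = func(*xs)
            x[idx] = orig        # restore the input
            g[idx] = np.sum(grad_out * (y_plus - y_minus)) / (2 * eps)
            it.iternext()
        grads.append(g)
    return grads
```

For ``f(a, b) = a + b + a`` the gradients are 2 with respect to ``a`` and 1 with respect to ``b``, which the recomputing backward pass recovers without any stored intermediates.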
Let ``f`` be a function defined as:

>>> def f(a, b):
...     return a + b + a

and let ``x`` and ``y`` be instances of ``chainer.Variable``:

>>> x = chainer.Variable(np.random.uniform(-1, 1, 5).astype('f'))
>>> y = chainer.Variable(np.random.uniform(-1, 1, 5).astype('f'))

When ``z`` is calculated as ``z = f(x, y)``, its internal result ``x + y`` is stored in memory. Instead, if you call ``f`` with this method:

>>> z = F.forget(f, x, y)

the internal result ``x + y`` is forgotten.
The method returns whatever ``func`` returns. If ``func`` returns a tuple, the method returns a tuple too.