Chainer
v5.0.0


chainer.links.NStepGRU¶

class chainer.links.NStepGRU(self, n_layers, in_size, out_size, dropout)[source]¶

Stacked Uni-directional GRU for sequences.

This link is a stacked version of uni-directional GRU for sequences. It calculates the hidden states of all layers at the end of each input sequence, and all hidden states of the last layer for each time step.

Unlike chainer.functions.n_step_gru(), this link automatically sorts the inputs in descending order by length and transposes the sequences. Users just need to call the link with a list of chainer.Variable objects holding the sequences.

Warning

The use_cudnn argument is no longer supported since v2. Use chainer.using_config('use_cudnn', use_cudnn) instead. See chainer.using_config().

Parameters:
  • n_layers (int) – Number of layers.
  • in_size (int) – Dimensionality of input vectors.
  • out_size (int) – Dimensionality of hidden states and output vectors.
  • dropout (float) – Dropout ratio.

See also

chainer.functions.n_step_gru()

Methods

__call__(*args, **kwargs)[source]¶

Call self as a function.

__getitem__(index)[source]¶

Returns the child at the given index.

Parameters:index (int) – Index of the child in the list.
Returns:The index-th child link.
Return type:Link
__setitem__(index, value)[source]¶
__len__()[source]¶

Returns the number of children.

__iter__()[source]¶
add_hook(hook, name=None)[source]¶

Registers a link hook.

Parameters:
  • hook (LinkHook) – Link hook to be registered.
  • name (str) – Name of the link hook. The name must be unique among link hooks registered to this link. If None, the default name of the link hook is used.
add_link(link)[source]¶

Registers a child link and adds it to the tail of the list.

Parameters:link (Link) – The link object to be registered.
add_param(name, shape=None, dtype=<class 'numpy.float32'>, initializer=None)[source]¶

Registers a parameter to the link.

Parameters:
  • name (str) – Name of the parameter. This name is also used as the attribute name.
  • shape (int or tuple of ints) – Shape of the parameter array. If it is omitted, the parameter variable is left uninitialized.
  • dtype – Data type of the parameter array.
  • initializer – If it is not None, the data is initialized with the given initializer. If it is an array, the data is directly initialized by it. If it is callable, it is used as a weight initializer. Note that in these cases, dtype argument is ignored.
add_persistent(name, value)[source]¶

Registers a persistent value to the link.

The registered value is saved and loaded on serialization and deserialization. The value is set to an attribute of the link.

Parameters:
  • name (str) – Name of the persistent value. This name is also used for the attribute name.
  • value – Value to be registered.
addgrads(link)[source]¶

Accumulates gradient values from given link.

This method adds each gradient array of the given link to the corresponding gradient array of this link. The accumulation works even across the host and different devices.

Parameters:link (Link) – Source link object.
append(value)¶

S.append(value) – append value to the end of the sequence

children()[source]¶

Returns a generator of all child links.

Returns:A generator object that generates all child links.
clear() → None -- remove all items from S¶
cleargrads()[source]¶

Clears all gradient arrays.

This method should be called before the backward computation at every iteration of the optimization.

copy(mode='share')[source]¶

Returns a deep copy of the ChainList.

copyparams(link, copy_persistent=True)[source]¶

Copies all parameters from given link.

This method copies data arrays of all parameters in the hierarchy. The copy is even done across the host and devices. Note that this method does not copy the gradient arrays.

From v5.0.0: this method also copies the persistent values (e.g. the moving statistics of BatchNormalization). If the persistent value is an ndarray, the elements are copied. Otherwise, it is copied using copy.deepcopy(). The old behavior (not copying persistent values) can be reproduced with copy_persistent=False.

Parameters:
  • link (Link) – Source link object.
  • copy_persistent (bool) – If True, persistent values are also copied. True by default.
count(value) → integer -- return number of occurrences of value¶
count_params()[source]¶

Counts the total number of parameters.

This method counts the total number of scalar values included in all the Parameters held by this link and its descendants.

If the link contains uninitialized parameters, this method raises a warning.

Returns:The total size of parameters (int)
delete_hook(name)[source]¶

Unregisters the link hook.

Parameters:name (str) – The name of the link hook to be unregistered.
disable_update()[source]¶

Disables update rules of all parameters under the link hierarchy.

This method sets the enabled flag of the update rule of each parameter variable to False.

enable_update()[source]¶

Enables update rules of all parameters under the link hierarchy.

This method sets the enabled flag of the update rule of each parameter variable to True.

extend(values)¶

S.extend(iterable) – extend sequence by appending elements from the iterable

forward(self, hx, xs)[source]¶

Calculates all hidden states.

Warning

train argument is not supported anymore since v2. Instead, use chainer.using_config('train', train). See chainer.using_config().

Parameters:
  • hx (Variable or None) – Initial hidden states. If None is specified, zero vectors are used. Its shape is (S, B, N) for a uni-directional RNN and (2S, B, N) for a bi-directional RNN, where S is the number of layers and is equal to n_layers, B is the mini-batch size, and N is the dimension of the hidden units.
  • xs (list of ~chainer.Variable) – List of input sequences. Each element xs[i] is a chainer.Variable holding a sequence. Its shape is (L_t, I), where L_t is the length of a sequence for time t, and I is the size of the input and is equal to in_size.
Returns:

This function returns a tuple containing two elements, hy and ys.

  • hy holds the updated hidden states, whose shape is the same as that of hx.
  • ys is a list of Variable. Each element ys[t] holds the hidden states of the last layer corresponding to the input xs[t]. Its shape is (L_t, N) for a uni-directional RNN and (L_t, 2N) for a bi-directional RNN, where L_t is the length of the sequence for time t and N is the size of the hidden units.

Return type:

tuple

index(value[, start[, stop]]) → integer -- return first index of value.¶

Raises ValueError if the value is not present.

init_hx(xs)[source]¶
init_scope()[source]¶

Creates an initialization scope.

This method returns a context manager object that enables registration of parameters (and links for Chain) by an assignment. A Parameter object can be automatically registered by assigning it to an attribute under this context manager.

Example

In most cases, the parameter registration is done in the initializer method. Using the init_scope method, we can simply assign a Parameter object to register it to the link.

class MyLink(chainer.Link):
    def __init__(self):
        super().__init__()
        with self.init_scope():
            self.W = chainer.Parameter(0, (10, 5))
            self.b = chainer.Parameter(0, (5,))
insert(index, link)[source]¶

Insert a child link at the given index.

Parameters:
  • index (int) – The position in the list where the new link is inserted.
  • link (Link) – The link to be inserted.
links(skipself=False)[source]¶

Returns a generator of all links under the hierarchy.

Parameters:skipself (bool) – If True, then the generator skips this link and starts with the first child link.
Returns:A generator object that generates all links.
namedlinks(skipself=False)[source]¶

Returns a generator of all (path, link) pairs under the hierarchy.

Parameters:skipself (bool) – If True, then the generator skips this link and starts with the first child link.
Returns:A generator object that generates all (path, link) pairs.
namedparams(include_uninit=True)[source]¶

Returns a generator of all (path, param) pairs under the hierarchy.

Parameters:include_uninit (bool) – If True, it also generates uninitialized parameters.
Returns:A generator object that generates all (path, parameter) pairs. The paths are relative from this link.
params(include_uninit=True)[source]¶

Returns a generator of all parameters under the link hierarchy.

Parameters:include_uninit (bool) – If True, it also generates uninitialized parameters.
Returns:A generator object that generates all parameters.
pop([index]) → item -- remove and return item at index (default last).¶

Raise IndexError if list is empty or index is out of range.

register_persistent(name)[source]¶

Registers an attribute of a given name as a persistent value.

This is a convenient method to register an existing attribute as a persistent value. If name has already been registered as a parameter, this method removes it from the list of parameter names and re-registers it as a persistent value.

Parameters:name (str) – Name of the attribute to be registered.
remove(value)¶

S.remove(value) – remove first occurrence of value. Raise ValueError if the value is not present.

repeat(n_repeat, mode='init')[source]¶

Repeats this link multiple times to make a Sequential.

This method returns a Sequential object which contains this Link repeated multiple times. The mode argument specifies how this link is copied for each repetition.

Example

You can repeat the same link multiple times to create a longer Sequential block like this:

class ConvBNReLU(chainer.Chain):

    def __init__(self):
        super(ConvBNReLU, self).__init__()
        with self.init_scope():
            self.conv = L.Convolution2D(
                None, 64, 3, 1, 1, nobias=True)
            self.bn = L.BatchNormalization(64)

    def forward(self, x):
        return F.relu(self.bn(self.conv(x)))

net = ConvBNReLU().repeat(16, mode='init')

The net object contains 16 blocks, each of which is a ConvBNReLU. Since the mode was init, each block is re-initialized with different parameters. If you give copy to this argument, each block has the same parameter values, but the object IDs differ. If it is share, the blocks are shallow copies, so they are identical not only in parameter values but also in object IDs; when a parameter of one block changes, the parameters of all the other blocks change as well.

Parameters:
  • n_repeat (int) – Number of times to repeat.
  • mode (str) – It should be either init, copy, or share. init means the parameters of each repeated element in the returned Sequential are re-initialized, so that all elements have different initial parameters. copy means the parameters are not re-initialized, but each element is deep-copied, so that all elements start from the same initial parameters and can be changed independently. share means all elements of the resulting Sequential are shallow copies of the same object, so that all their parameters are shared with each other.
reverse()¶

S.reverse() – reverse IN PLACE

rnn(*args)[source]¶

Calls RNN function.

This function must be implemented in a child class.

serialize(serializer)[source]¶

Serializes the link object.

Parameters:serializer (AbstractSerializer) – Serializer object.
to_cpu()[source]¶

Copies parameter variables and persistent values to CPU.

This method does not handle non-registered attributes. If some of such attributes must be copied to CPU, the link implementation must override this method to do so.

Returns: self

to_gpu(device=None)[source]¶

Copies parameter variables and persistent values to GPU.

This method does not handle non-registered attributes. If some of such attributes must be copied to GPU, the link implementation must override this method to do so.

Parameters:device – Target device specifier. If omitted, the current device is used.

Returns: self

to_intel64()[source]¶

Copies parameter variables and persistent values to CPU.

zerograds()[source]¶

Initializes all gradient arrays by zero.

This method can be used for the same purpose as cleargrads(), but it is less efficient. It is left only for backward compatibility.

Deprecated since version v1.15: Use cleargrads() instead.

Attributes

local_link_hooks¶

Ordered dictionary of registered link hooks.

Contrary to chainer.thread_local.link_hooks, which registers its elements to all functions, link hooks in this property are specific to this link.

n_cells¶

Returns the number of cells.

This function must be implemented in a child class.

n_weights = 6¶
update_enabled¶

True if at least one parameter has an update rule enabled.

use_bi_direction = False¶
within_init_scope¶

True if the current code is inside of an initialization scope.

See init_scope() for the details of the initialization scope.

xp¶

Array module for this link.

Depending on which of CPU/GPU this link is on, this property returns numpy or cupy.


© Copyright 2015, Preferred Networks, inc. and Preferred Infrastructure, inc. Revision e4a0b8fc.
