Chainer
v3.1.0
Standard Link implementations

Chainer provides many Link implementations in the chainer.links package.

Note

Some of these links were originally defined in the chainer.functions namespace. They remain there for backward compatibility, but it is strongly recommended to use them via the chainer.links package.

Learnable connections

chainer.links.Bias Broadcasted elementwise summation with learnable parameters.
chainer.links.Bilinear Bilinear layer that performs tensor multiplication.
chainer.links.ChildSumTreeLSTM Child-Sum TreeLSTM unit.
chainer.links.Convolution2D Two-dimensional convolutional layer.
chainer.links.ConvolutionND N-dimensional convolution layer.
chainer.links.Deconvolution2D Two-dimensional deconvolution layer.
chainer.links.DeconvolutionND N-dimensional deconvolution layer.
chainer.links.DepthwiseConvolution2D Two-dimensional depthwise convolutional layer.
chainer.links.DilatedConvolution2D Two-dimensional dilated convolutional layer.
chainer.links.EmbedID Efficient linear layer for one-hot input.
chainer.links.GRU Stateful Gated Recurrent Unit function (GRU).
chainer.links.Highway Highway module.
chainer.links.Inception Inception module of GoogLeNet.
chainer.links.InceptionBN Inception module of the new GoogLeNet with BatchNormalization.
chainer.links.Linear Linear layer (a.k.a. fully-connected layer).
chainer.links.LSTM Fully-connected LSTM layer.
chainer.links.MLPConvolution2D Two-dimensional MLP convolution layer of Network in Network.
chainer.links.NaryTreeLSTM N-ary TreeLSTM unit.
chainer.links.NStepBiGRU Stacked Bi-directional GRU for sequences.
chainer.links.NStepBiLSTM Stacked Bi-directional LSTM for sequences.
chainer.links.NStepBiRNNReLU Stacked Bi-directional RNN for sequences with ReLU activation.
chainer.links.NStepBiRNNTanh Stacked Bi-directional RNN for sequences with tanh activation.
chainer.links.NStepGRU Stacked Uni-directional GRU for sequences.
chainer.links.NStepLSTM Stacked Uni-directional LSTM for sequences.
chainer.links.NStepRNNReLU Stacked Uni-directional RNN for sequences with ReLU activation.
chainer.links.NStepRNNTanh Stacked Uni-directional RNN for sequences with tanh activation.
chainer.links.Scale Broadcasted elementwise product with learnable parameters.
chainer.links.StatefulGRU Stateful Gated Recurrent Unit function (GRU).
chainer.links.StatelessGRU Stateless Gated Recurrent Unit function (GRU).
chainer.links.StatefulPeepholeLSTM Fully-connected LSTM layer with peephole connections.
chainer.links.StatelessLSTM Stateless LSTM layer.

Activation/loss/normalization functions with parameters

chainer.links.BatchNormalization Batch normalization layer on outputs of linear or convolution functions.
chainer.links.LayerNormalization Layer normalization layer on outputs of linear functions.
chainer.links.BinaryHierarchicalSoftmax Hierarchical softmax layer over binary tree.
chainer.links.BlackOut BlackOut loss layer.
chainer.links.CRF1d Linear-chain conditional random field loss layer.
chainer.links.SimplifiedDropconnect Fully-connected layer with simplified dropconnect regularization.
chainer.links.PReLU Parametric ReLU function as a link.
chainer.links.Maxout Fully-connected maxout layer.
chainer.links.NegativeSampling Negative sampling loss layer.

Machine learning models

chainer.links.Classifier A simple classifier model.

Pre-trained models

Pre-trained models are mainly used to achieve good performance with a small dataset, or to extract semantic feature vectors. Although CaffeFunction automatically loads a pre-trained model released as a caffemodel, the following link models provide an interface for automatically converting caffemodels and for easily extracting semantic feature vectors.

For example, to extract the feature vectors with VGG16Layers, which is a common pre-trained model in the field of image recognition, users need to write the following few lines:

from chainer.links import VGG16Layers
from PIL import Image

model = VGG16Layers()
img = Image.open("path/to/image.jpg")
feature = model.extract([img], layers=["fc7"])["fc7"]

where fc7 denotes the layer immediately before the last fully-connected layer. Unlike regular links, these classes automatically load all of their parameters from the pre-trained models during initialization.

VGG16Layers

chainer.links.VGG16Layers A pre-trained CNN model with 16 layers provided by VGG team.
chainer.links.model.vision.vgg.prepare Converts the given image to the numpy array for VGG models.

GoogLeNet

chainer.links.GoogLeNet A pre-trained GoogLeNet model provided by BVLC.
chainer.links.model.vision.googlenet.prepare Converts the given image to the numpy array for GoogLeNet.

Residual Networks

chainer.links.model.vision.resnet.ResNetLayers A pre-trained CNN model provided by MSRA.
chainer.links.ResNet50Layers A pre-trained CNN model with 50 layers provided by MSRA.
chainer.links.ResNet101Layers A pre-trained CNN model with 101 layers provided by MSRA.
chainer.links.ResNet152Layers A pre-trained CNN model with 152 layers provided by MSRA.
chainer.links.model.vision.resnet.prepare Converts the given image to the numpy array for ResNets.

Compatibility with other frameworks

chainer.links.TheanoFunction Theano function wrapper.
chainer.links.caffe.CaffeFunction Caffe emulator based on the model file of Caffe.
© Copyright 2015, Preferred Networks, inc. and Preferred Infrastructure, inc. Revision 0f4b5f2f.