chainer.functions.simplified_dropconnect(x, W, b=None, ratio=0.5, train=True, mask=None, use_batchwise_mask=True)[source]

Linear unit regularized by simplified dropconnect.

Simplified dropconnect drops weight matrix elements randomly with probability ratio and scales the remaining elements by a factor of 1 / (1 - ratio). It accepts two or three arguments: an input minibatch x, a weight matrix W, and optionally a bias vector b. It computes \(Y = xW^\top + b\).

In test mode (train=False), a dropconnect ratio of zero is used instead of ratio, so the function behaves as an ordinary linear function.

Notice: This implementation cannot be used to reproduce the results of the paper, because it differs from the original method. The original version samples from a Gaussian distribution before applying the activation function, whereas this implementation averages before the activation.
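As a rough illustration of the computation described above, here is a minimal NumPy sketch (not Chainer's actual implementation; the function name and structure are illustrative):

```python
import numpy as np

def simplified_dropconnect(x, W, b=None, ratio=0.5, train=True,
                           mask=None, use_batchwise_mask=True):
    """NumPy sketch of the simplified dropconnect forward pass."""
    if not train:
        ratio = 0.0  # test mode: behaves as a plain linear function
    scale = 1.0 / (1.0 - ratio)
    if mask is None:
        # Sample a Bernoulli(1 - ratio) keep-mask, per sample or shared.
        shape = (len(x),) + W.shape if use_batchwise_mask else W.shape
        mask = (np.random.rand(*shape) >= ratio).astype(W.dtype)
    masked_W = mask * W * scale          # drop and rescale weights
    if masked_W.ndim == 3:               # batchwise mask: per-sample weights
        y = np.einsum('ij,imj->im', x, masked_W)
    else:                                # shared mask: ordinary matmul
        y = x.dot(masked_W.T)
    return y + b if b is not None else y
```

With ratio=0 or train=False this reduces to the plain linear function \(Y = xW^\top + b\).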

Parameters

  • x (Variable or N-dimensional array) – Input variable. Its first dimension n is assumed to be the minibatch dimension. The other dimensions are flattened into a single dimension, whose total size must be N.

  • W (Variable or N-dimensional array) – Weight variable of shape (M, N).

  • b (Variable or N-dimensional array) – Bias variable (optional) of shape (M,).

  • ratio (float) – Dropconnect ratio.

  • train (bool) – If True, executes simplified dropconnect. Otherwise, the function works as an ordinary linear function.

  • mask (None or Variable or N-dimensional array) – If None, a random dropconnect mask is generated. Otherwise, the given array is used as the dropconnect mask; it must have shape (n, M, N) or (M, N), and use_batchwise_mask is ignored. This option is intended mainly for debugging.

  • use_batchwise_mask (bool) – If True, a different set of connections is dropped for each sample in the mini-batch.


Returns

Output variable.

Return type

Variable

See also

Wan, L., Zeiler, M., Zhang, S., LeCun, Y., Fergus, R. (2013). Regularization of Neural Networks using DropConnect. International Conference on Machine Learning.