simplified_dropconnect(x, W, b=None, ratio=0.5, train=True, mask=None, use_batchwise_mask=True)
Linear unit regularized by simplified dropconnect.
Simplified dropconnect drops weight matrix elements randomly with probability ratio and scales the remaining elements by factor 1 / (1 - ratio). It accepts two or three arguments: an input minibatch x, a weight matrix W, and optionally a bias vector b. It computes \(Y = xW^\top + b\).
In testing mode, zero will be used as the simplified dropconnect ratio instead of ratio.
Notice: This implementation cannot be used to reproduce the results of the paper, because it differs from the original method. The original version samples from a Gaussian distribution before applying the activation function, whereas this implementation averages before the activation.
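The masking and scaling described above can be sketched in NumPy. This is a minimal illustration of the semantics (batchwise mask, rescaling by 1 / (1 - ratio), test-mode passthrough), not the library's actual implementation; the function name is hypothetical:

```python
import numpy as np

def simplified_dropconnect_sketch(x, W, b=None, ratio=0.5, train=True, mask=None):
    # x: (n, N) input minibatch; W: (M, N) weight matrix (illustrative sketch).
    if train:
        if mask is None:
            # Keep each weight element with probability (1 - ratio),
            # sampled independently per sample in the minibatch.
            mask = np.random.rand(x.shape[0], *W.shape) >= ratio
        # Rescale surviving elements so the expectation matches the full matrix.
        scaled_W = mask * W / (1.0 - ratio)
        # Batched product: y[i] = x[i] @ scaled_W[i].T
        y = np.einsum('in,imn->im', x, scaled_W)
    else:
        # Test mode: ratio is treated as zero, i.e. a plain linear transform.
        y = x @ W.T
    if b is not None:
        y = y + b
    return y
```

With ratio set to 0 the mask keeps every element, so the output equals the ordinary linear transform; in test mode the mask is skipped entirely.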
- x (Variable or N-dimensional array) – Input variable. Its first dimension n is assumed to be the minibatch dimension. The other dimensions are treated as one concatenated dimension, whose size must be N.
- W (Variable or N-dimensional array) – Weight variable of shape (M, N).
- b (Variable or N-dimensional array) – Bias variable (optional) of shape (M,).
- ratio (float) – Dropconnect ratio.
- train (bool) – If True, executes simplified dropconnect. Otherwise, this function works as a plain linear function.
- mask (None or chainer.Variable or numpy.ndarray or cupy.ndarray) – If None, a randomized dropconnect mask is generated. Otherwise, the mask must be an array of shape (n, M, N) or (M, N), and use_batchwise_mask is ignored. The given array is used as the dropconnect mask; the main purpose of this option is debugging.
- use_batchwise_mask (bool) – If True, dropped connections differ for each sample in the mini-batch.
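The effect of use_batchwise_mask on the mask shape can be illustrated with a small helper (hypothetical, not a library function): batchwise masking draws an independent (M, N) mask per sample, giving shape (n, M, N), while a shared mask has shape (M, N).

```python
import numpy as np

def make_dropconnect_mask(batch_size, out_size, in_size, ratio=0.5,
                          use_batchwise_mask=True):
    # Batchwise: one (M, N) mask per sample -> shape (n, M, N).
    # Shared: a single (M, N) mask reused across the whole minibatch.
    shape = ((batch_size, out_size, in_size) if use_batchwise_mask
             else (out_size, in_size))
    # True marks a kept connection; each element is kept with prob. (1 - ratio).
    return np.random.rand(*shape) >= ratio

per_sample = make_dropconnect_mask(8, 4, 3, use_batchwise_mask=True)
shared = make_dropconnect_mask(8, 4, 3, use_batchwise_mask=False)
# per_sample.shape == (8, 4, 3); shared.shape == (4, 3)
```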
Wan, L., Zeiler, M., Zhang, S., LeCun, Y., Fergus, R. (2013). Regularization of Neural Networks using DropConnect. International Conference on Machine Learning.