chainer.functions.softplus

chainer.functions.softplus(x, beta=1.0)

Element-wise softplus function.

The softplus function is a smooth approximation of ReLU.

\[f(x)=\frac{1}{\beta}\log(1 + \exp(\beta x)),\]

where \(\beta\) is a parameter that controls the smoothness of the approximation. As \(\beta\) increases, the curve sharpens and the function approaches ReLU.
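
As a quick check of the definition, the formula can be evaluated directly with NumPy (a minimal sketch for illustration only; softplus_ref is not part of the Chainer API):

>>> import numpy as np
>>> def softplus_ref(x, beta=1.0):
...     # f(x) = (1 / beta) * log(1 + exp(beta * x)); log1p keeps the
...     # result accurate when exp(beta * x) is small.
...     return np.log1p(np.exp(beta * x)) / beta
>>> np.allclose(softplus_ref(np.array([0.0])), np.log(2.0))  # softplus(0) = log(2) when beta = 1
True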

Parameters:
  • x (Variable or numpy.ndarray or cupy.ndarray) – Input variable. A \((s_1, s_2, ..., s_N)\)-shaped float array.
  • beta (float) – Parameter \(\beta\).
Returns:
  Output variable. A \((s_1, s_2, ..., s_N)\)-shaped float array.
Return type:
  Variable

Example

>>> import numpy as np
>>> import chainer.functions as F
>>> x = np.arange(-2, 3, 2).astype('f')
>>> x
array([-2.,  0.,  2.], dtype=float32)
>>> F.softplus(x, beta=1.0).data
array([ 0.126928  ,  0.69314718,  2.12692809], dtype=float32)
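
Continuing the example above, the effect of beta can be checked in the same way: as noted earlier, larger beta pushes the output toward max(0, x). The comparison below is a rough sketch, with the tolerance 0.05 chosen only for illustration:

>>> y_sharp = F.softplus(x, beta=20.0).data
>>> np.allclose(y_sharp, np.maximum(x, 0.0), atol=0.05)  # nearly ReLU for large beta
True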