npdl.activation

Non-linear activation functions for artificial neurons.
Activations
Sigmoid() | Sigmoid activation function \(\varphi(x) = \frac{1}{1 + e^{-x}}\)
Tanh()    | Tanh activation function \(\varphi(x) = \tanh(x)\)
ReLU()    | Rectify activation function \(\varphi(x) = \max(0, x)\)
Linear()  | Linear activation function \(\varphi(x) = x\)
Softmax() | Softmax activation function \(\varphi(\mathbf{x})_j = \frac{e^{\mathbf{x}_j}}{\sum_{k=1}^K e^{\mathbf{x}_k}}\), where \(K\) is the total number of neurons in the layer
Detailed description
class npdl.activation.Sigmoid

    Sigmoid activation function \(\varphi(x) = \frac{1}{1 + e^{-x}}\)

    Parameters:
        x : float32
            The activation (the summed, weighted input of a neuron).
    Returns:
        float32 in [0, 1]
            The output of the sigmoid function applied to the activation.
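A minimal NumPy sketch of the formula (illustrative only; `sigmoid` here is a hypothetical free function, not npdl's own implementation):

    import numpy as np

    def sigmoid(x):
        # phi(x) = 1 / (1 + exp(-x)): squashes any real input into (0, 1).
        return 1.0 / (1.0 + np.exp(-x))

    # e.g. sigmoid(0.0) == 0.5; large positive inputs approach 1,
    # large negative inputs approach 0.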
class npdl.activation.Tanh

    Tanh activation function \(\varphi(x) = \tanh(x)\)

    Parameters:
        x : float32
            The activation (the summed, weighted input of a neuron).
    Returns:
        float32 in [-1, 1]
            The output of the tanh function applied to the activation.
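The same kind of sketch for tanh (again an illustration, not the library's code):

    import numpy as np

    def tanh(x):
        # phi(x) = tanh(x): like the sigmoid but zero-centered,
        # with outputs in (-1, 1).
        return np.tanh(x)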
class npdl.activation.ReLU

    Rectify activation function \(\varphi(x) = \max(0, x)\)

    Parameters:
        x : float32
            The activation (the summed, weighted input of a neuron).
    Returns:
        float32 in [0, ∞)
            The output of the rectify function applied to the activation.
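A minimal NumPy sketch of the formula (illustrative only, not npdl's own implementation):

    import numpy as np

    def relu(x):
        # phi(x) = max(0, x), applied elementwise: negative inputs are
        # clipped to zero, positive inputs pass through unchanged.
        return np.maximum(0.0, x)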
class npdl.activation.Linear

    Linear activation function \(\varphi(x) = x\)

    Parameters:
        x : float32
            The activation (the summed, weighted input of a neuron).
    Returns:
        float32
            The output of the identity function applied to the activation.
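The corresponding sketch is trivial (illustrative only):

    def linear(x):
        # phi(x) = x: the identity, typically used where a layer's
        # output should not be transformed, e.g. regression outputs.
        return x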
class npdl.activation.Softmax

    Softmax activation function \(\varphi(\mathbf{x})_j = \frac{e^{\mathbf{x}_j}}{\sum_{k=1}^K e^{\mathbf{x}_k}}\), where \(K\) is the total number of neurons in the layer. This activation function is applied row-wise.

    Parameters:
        x : float32 array
            The activations (the summed, weighted inputs of the neurons in a layer), one row per sample.
    Returns:
        float32 array where each row sums to 1 and every value is in [0, 1]
            The output of the softmax function applied to the activation.
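A minimal NumPy sketch of a row-wise softmax (illustrative only; the max-subtraction trick is a common stabilization, not necessarily what npdl does internally):

    import numpy as np

    def softmax(x):
        # Subtracting the row-wise maximum before exponentiating avoids
        # overflow and leaves the result unchanged, since the shift
        # cancels in the ratio.
        e = np.exp(x - np.max(x, axis=-1, keepdims=True))
        return e / np.sum(e, axis=-1, keepdims=True)

    # Each row of the result sums to 1 and can be read as a probability
    # distribution over the K neurons of the layer.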