activationFunctions/activation_funciton library
Classes
- ActivationFunction: The abstract base class (or interface) for all activation functions.
- ELU: An activation function that applies the Exponential Linear Unit (ELU) to a Vector.
- ELUMatrix: An activation function that applies the Exponential Linear Unit (ELU) to a Matrix.
- LeakyReLU: The Leaky Rectified Linear Unit (Leaky ReLU) activation function.
- Mish: The Mish activation function.
- ReLU: An activation function that applies the Rectified Linear Unit (ReLU) to a Vector.
- ReLUMatrix: An activation function that applies the Rectified Linear Unit (ReLU) to a Matrix.
- Sigmoid: The Sigmoid activation function.
- SigmoidMatrix: An activation function that applies the Sigmoid function to a Matrix.
- Softmax: An activation function that applies Softmax to a Vector.
- SoftmaxMatrix: An activation function that applies Softmax to each row of a Matrix.
- Swish: The Sigmoid-weighted Linear Unit (SiLU) activation function, also known as Swish.
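For reference, the mathematical definitions behind these classes can be sketched in plain Python. This is not the library's API, only an illustration of each formula; the function names and the default `alpha` values are assumptions chosen for the sketch.

```python
import math

def relu(x):
    # ReLU: max(0, x)
    return max(0.0, x)

def leaky_relu(x, alpha=0.01):
    # Leaky ReLU: x for x > 0, alpha * x otherwise (alpha assumed 0.01)
    return x if x > 0 else alpha * x

def elu(x, alpha=1.0):
    # ELU: x for x > 0, alpha * (e^x - 1) otherwise (alpha assumed 1.0)
    return x if x > 0 else alpha * (math.exp(x) - 1.0)

def sigmoid(x):
    # Sigmoid: 1 / (1 + e^-x)
    return 1.0 / (1.0 + math.exp(-x))

def swish(x):
    # Swish / SiLU: x * sigmoid(x)
    return x * sigmoid(x)

def mish(x):
    # Mish: x * tanh(softplus(x)), where softplus(x) = ln(1 + e^x)
    return x * math.tanh(math.log1p(math.exp(x)))

def softmax(xs):
    # Softmax over a vector; shift by the max for numerical stability
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]
```

The Matrix variants in the list apply the same element-wise formulas to every entry, except SoftmaxMatrix, which normalizes each row independently so that each row sums to 1.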