activationFunctions/activation_funciton library

Classes

ActivationFunction
The abstract base class that all activation functions implement.
ELU
An activation function that applies the Exponential Linear Unit (ELU) to a Vector.
ELUMatrix
An activation function that applies the Exponential Linear Unit (ELU) to a Matrix.
LeakyReLU
The Leaky Rectified Linear Unit (Leaky ReLU) activation function.
Mish
The Mish activation function.
ReLU
An activation function that applies the Rectified Linear Unit (ReLU) to a Vector.
ReLUMatrix
An activation function that applies the Rectified Linear Unit (ReLU) to a Matrix.
Sigmoid
The Sigmoid activation function.
SigmoidMatrix
An activation function that applies the Sigmoid function to a Matrix.
Softmax
An activation function that applies Softmax to a Vector.
SoftmaxMatrix
An activation function that applies Softmax to each row of a Matrix.
Swish
The Sigmoid-weighted Linear Unit (SiLU) activation function, also known as Swish.

Functions

matrixELU(Tensor<Matrix> m, {double alpha = 1.0}) → Tensor<Matrix>
Mathematical operation for the ELU function on a matrix.
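As a sketch of the formula this operation computes (in Python rather than Dart, with illustrative names; the library itself operates on `Tensor<Matrix>` values): ELU is the identity for positive inputs and `alpha * (e^x - 1)` for non-positive ones, applied to every element of the matrix.

```python
import math

def elu(x, alpha=1.0):
    # ELU: identity for x > 0, alpha * (e^x - 1) otherwise
    return x if x > 0 else alpha * (math.exp(x) - 1.0)

def matrix_elu(m, alpha=1.0):
    # Apply ELU to every element of a matrix given as a list of rows
    return [[elu(x, alpha) for x in row] for row in m]

print(matrix_elu([[2.0, -1.0], [0.0, -3.0]]))
```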
softmax(Tensor<Vector> v) → Tensor<Vector>
Mathematical operation for the Softmax function on a vector.
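A minimal Python sketch of the softmax formula (illustrative names, not the library's API): exponentiate each element and normalize so the outputs sum to 1, shifting by the maximum first for numerical stability.

```python
import math

def softmax(v):
    # Shift by the max so exp() cannot overflow; the result is unchanged
    m = max(v)
    exps = [math.exp(x - m) for x in v]
    total = sum(exps)
    return [e / total for e in exps]

print(softmax([1.0, 2.0, 3.0]))
```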
softmaxMatrix(Tensor<Matrix> m) → Tensor<Matrix>
Mathematical operation for applying Softmax to each row of a matrix.
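Sketched in Python (illustrative, assuming a matrix as a list of rows): each row is normalized independently, so each row of the result sums to 1.

```python
import math

def softmax_matrix(m):
    # Apply a numerically stable softmax to each row independently
    result = []
    for row in m:
        mx = max(row)
        exps = [math.exp(x - mx) for x in row]
        total = sum(exps)
        result.append([e / total for e in exps])
    return result

print(softmax_matrix([[1.0, 1.0], [0.0, 10.0]]))
```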
vectorELU(Tensor<Vector> v, {double alpha = 1.0}) → Tensor<Vector>
Mathematical operation for the ELU function on a vector.
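The vector form of the same ELU formula, sketched in Python (illustrative names): note that `alpha` sets the saturation bound, since negative inputs approach `-alpha` as x → −∞.

```python
import math

def vector_elu(v, alpha=1.0):
    # ELU per element; negative inputs saturate toward -alpha
    return [x if x > 0 else alpha * (math.exp(x) - 1.0) for x in v]

print(vector_elu([3.0, -1.0, -100.0], alpha=0.5))
```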
vectorLeakyReLU(Tensor<Vector> v, {double alpha = 0.01}) → Tensor<Vector>
Mathematical operation for element-wise Leaky ReLU on a vector.
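Sketched in Python (illustrative names): Leaky ReLU passes positive inputs through unchanged and scales negative inputs by a small slope `alpha`, avoiding the zero gradient of plain ReLU on the negative side.

```python
def vector_leaky_relu(v, alpha=0.01):
    # Positive inputs pass through; negative inputs are scaled by alpha
    return [x if x > 0 else alpha * x for x in v]

print(vector_leaky_relu([5.0, -2.0]))
```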
vectorMish(Tensor<Vector> x) → Tensor<Vector>
Mathematical operation for the Mish function on a vector.
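A Python sketch of the Mish formula (illustrative names): `Mish(x) = x * tanh(softplus(x))`, where `softplus(x) = ln(1 + e^x)` is computed in a numerically stable form here to avoid overflow for large x.

```python
import math

def softplus(x):
    # Stable softplus: ln(1 + e^x) without overflowing exp() for large x
    return x + math.log1p(math.exp(-x)) if x > 0 else math.log1p(math.exp(x))

def vector_mish(v):
    # Mish(x) = x * tanh(softplus(x)); smooth and non-monotonic near zero
    return [x * math.tanh(softplus(x)) for x in v]

print(vector_mish([0.0, 1.0, 10.0]))
```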
vectorSwish(Tensor<Vector> v) → Tensor<Vector>
Mathematical operation for the Swish function on a vector.
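Sketched in Python (illustrative names): Swish/SiLU multiplies each input by its own sigmoid, `x * sigmoid(x)`, so it behaves like a smooth, self-gated ReLU.

```python
import math

def sigmoid(x):
    # Logistic sigmoid: 1 / (1 + e^-x)
    return 1.0 / (1.0 + math.exp(-x))

def vector_swish(v):
    # Swish/SiLU: x * sigmoid(x)
    return [x * sigmoid(x) for x in v]

print(vector_swish([0.0, 1.0, -1.0]))
```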