ELU class
An activation function that applies the Exponential Linear Unit (ELU) to a Vector.
ELU is an alternative to ReLU that produces small negative outputs for negative
inputs, which can help prevent the "dying ReLU" problem and speed up learning.
The function is defined as $f(x) = x$ if $x > 0$, and $f(x) = \alpha(e^x - 1)$ if $x \le 0$.
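The piecewise definition above can be sketched as a plain scalar function; this is an illustrative Python sketch of the ELU math only, not this library's Tensor-based API:

```python
import math

def elu(x, alpha=1.0):
    # Identity for positive inputs; alpha * (e^x - 1) for x <= 0,
    # which saturates smoothly toward -alpha as x -> -inf.
    return x if x > 0 else alpha * (math.exp(x) - 1.0)

print(elu(2.0))   # 2.0 (positive inputs pass through unchanged)
print(elu(-1.0))  # alpha * (e^-1 - 1), roughly -0.632
```

Because the negative branch stays bounded at `-alpha` instead of being clamped to zero, gradients for negative inputs remain nonzero, which is what mitigates the dying-ReLU effect.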
Implemented types
Properties
Methods
- call(Tensor input) → Tensor<Vector>
  override
- noSuchMethod(Invocation invocation) → dynamic
  Invoked when a nonexistent method or property is accessed.
  inherited
- toString() → String
  A string representation of this object.
  inherited
Operators
- operator ==(Object other) → bool
  The equality operator.
  inherited