ELU class

An activation function that applies the Exponential Linear Unit (ELU) to a Vector.

ELU is an alternative to ReLU that produces small negative outputs for negative inputs, which can help prevent the "dying ReLU" problem and speed up learning.

The function is defined as $f(x) = x$ if $x > 0$, and $f(x) = \alpha(e^x - 1)$ if $x \le 0$.
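The piecewise definition above can be sketched as a plain Dart function (a minimal illustration of the math only, independent of this class's Tensor API; the helper name `elu` is for illustration):

```dart
import 'dart:math' as math;

/// Exponential Linear Unit: x for positive inputs,
/// alpha * (e^x - 1) for non-positive inputs.
double elu(double x, {double alpha = 1.0}) =>
    x > 0 ? x : alpha * (math.exp(x) - 1);

void main() {
  print(elu(2.0));  // positive inputs pass through unchanged: 2.0
  print(elu(-1.0)); // alpha * (e^-1 - 1), about -0.632
  print(elu(-6.0)); // large negative inputs saturate toward -alpha
}
```

Note how the negative branch is bounded below by $-\alpha$, which is what gives ELU its non-zero gradient for negative inputs.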

Implemented types

Constructors

ELU({double alpha = 1.0})
Creates an ELU activation function with the given alpha.

Properties

alpha → double
Controls the value to which the output saturates for negative inputs; as $x \to -\infty$, $f(x) \to -\alpha$.
final
hashCode → int
The hash code for this object.
no setter, inherited
runtimeType → Type
A representation of the runtime type of the object.
no setter, inherited

Methods

call(Tensor input) → Tensor<Vector>
Applies the ELU function element-wise to input.
override
noSuchMethod(Invocation invocation) → dynamic
Invoked when a nonexistent method or property is accessed.
inherited
toString() → String
A string representation of this object.
inherited

Operators

operator ==(Object other) → bool
The equality operator.
inherited