LeakyReLU class

The Leaky Rectified Linear Unit (Leaky ReLU) activation function.

This is a variant of the standard ReLU function. Instead of outputting zero for negative inputs, Leaky ReLU applies a small non-zero slope (alpha) to them. This helps prevent the "dying ReLU" problem (units whose gradient is always zero and therefore stop learning) and can lead to more robust training.

The function is defined as $f(x) = x$ if $x > 0$, and $f(x) = \alpha \cdot x$ if $x \le 0$. The alpha value is a small positive constant, typically 0.01.
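
For illustration, the same rule written as a standalone Dart function; the leakyRelu name and scalar signature below are a sketch, not part of this class's API:

// Illustrative sketch of the formula above; not part of the LeakyReLU class.
double leakyRelu(double x, {double alpha = 0.01}) =>
    x > 0 ? x : alpha * x;

// leakyRelu(2.0)                 -> 2.0   (positive inputs pass through)
// leakyRelu(-2.0)                -> -0.02 (negative inputs are scaled by alpha)
// leakyRelu(-2.0, alpha: 0.02)   -> -0.04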

Example

// A hidden layer using LeakyReLU with a custom slope.
Layer hiddenLayer = DenseLayer(128, activation: LeakyReLU(alpha: 0.02));

Constructors

LeakyReLU({double alpha = 0.01})
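
For example (uses only this documented constructor; the variable names are illustrative):

// Default slope of 0.01 for negative inputs.
final defaultActivation = LeakyReLU();
// A steeper custom slope.
final customActivation = LeakyReLU(alpha: 0.05);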

Properties

alpha → double
The small slope for negative inputs.
final
hashCode → int
The hash code for this object.
no setter, inherited
runtimeType → Type
A representation of the runtime type of the object.
no setter, inherited

Methods

call(Tensor input) → Tensor<Vector>
Applies the Leaky ReLU function element-wise to the input tensor; see the sketch after this list.
override
noSuchMethod(Invocation invocation) → dynamic
Invoked when a nonexistent method or property is accessed.
inherited
toString() → String
A string representation of this object.
inherited
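
A minimal sketch of what "element-wise" means for call, assuming a plain List<double> stands in for the library's Tensor type (whose construction is not documented on this page); the applyLeakyRelu helper is hypothetical:

// Hypothetical stand-in: applies the rule x > 0 ? x : alpha * x
// to every element of a list instead of a Tensor.
List<double> applyLeakyRelu(List<double> values, {double alpha = 0.01}) =>
    values.map((x) => x > 0 ? x : alpha * x).toList();

// applyLeakyRelu([-1.0, 0.0, 2.5]) -> [-0.01, 0.0, 2.5]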

Operators

operator ==(Object other) → bool
The equality operator.
inherited