LeakyReLUOp class

Leaky Rectified Linear Unit activation function.

Outputs x if x > 0, and negativeSlope * x otherwise. Equivalent to F.leaky_relu() in PyTorch.

final result = LeakyReLUOp(negativeSlope: 0.1)(tensor);
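
For example, with negativeSlope: 0.1, the element -2.0 maps to -0.2 while 3.0 passes through unchanged. A minimal sketch; the TensorBuffer.fromList factory is assumed here for illustration only:

final op = LeakyReLUOp(negativeSlope: 0.1);
final input = TensorBuffer.fromList([-2.0, 0.0, 3.0]); // hypothetical factory
final output = op(input); // elements: [-0.2, 0.0, 3.0]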

Constructors

LeakyReLUOp({double negativeSlope = 0.01})
Creates a Leaky ReLU operation with the given negativeSlope.
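
For example:

final defaultOp = LeakyReLUOp();                 // negativeSlope defaults to 0.01
final steepOp = LeakyReLUOp(negativeSlope: 0.2); // steeper negative region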

Properties

hashCode → int
The hash code for this object.
no setter; inherited
name → String
The human-readable name of this operation.
no setter; override
negativeSlope → double
The slope for negative values. Default is 0.01.
final
requiresContiguous → bool
Whether this operation requires contiguous input.
no setter; inherited
runtimeType → Type
A representation of the runtime type of the object.
no setter; inherited

Methods

apply(TensorBuffer input) → TensorBuffer
Applies this transform to input and returns the result.
override
applyInPlace(TensorBuffer input) → void
Applies this transform to input in place.
override
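
A sketch of the difference between the two methods, assuming input is a writable TensorBuffer: apply allocates and returns a new buffer, while applyInPlace overwrites input.

final activated = op.apply(input); // input is left unchanged
op.applyInPlace(input);            // input now holds the activated values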
call(TensorBuffer input) → TensorBuffer
Alias for apply.
inherited
cloneForModification(TensorBuffer input) → TensorBuffer
Creates an output buffer from input, ensuring contiguity with a single copy.
inherited
computeOutputShape(List<int> inputShape) → List<int>
Computes the output shape for a given inputShape.
override
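
Because Leaky ReLU is an element-wise operation, the output shape should match the input shape, e.g.:

final shape = LeakyReLUOp().computeOutputShape([2, 3]); // expected: [2, 3]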
ensureContiguous(TensorBuffer input) → TensorBuffer
Returns a contiguous version of input if needed.
inherited
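
A usage sketch; per the description above, the buffer is presumably returned as-is when it is already contiguous:

final contiguous = op.ensureContiguous(input); // copies only if non-contiguous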
noSuchMethod(Invocation invocation) → dynamic
Invoked when a nonexistent method or property is accessed.
inherited
toString() → String
A string representation of this object.
inherited

Operators

operator ==(Object other) → bool
The equality operator.
inherited