ReLU class

An activation function that applies the Rectified Linear Unit (ReLU) to a Vector.

ReLU is the most common activation function for hidden layers. It is defined as $f(x) = \max(0, x)$, outputting the input if it is positive and zero otherwise.

This version is designed to work on 1D Vector inputs.
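
As a concrete illustration of the definition above, here is a minimal standalone sketch in plain Dart. It uses List<double> instead of the library's Vector type, an assumption made only to keep the example self-contained:

```dart
import 'dart:math' as math;

/// Applies f(x) = max(0, x) element-wise to a plain list of doubles.
/// A standalone sketch of the math, not the library's Vector API.
List<double> relu(List<double> input) =>
    input.map((x) => math.max(0.0, x)).toList();

void main() {
  print(relu([-2.0, -0.5, 0.0, 1.5, 3.0])); // [0.0, 0.0, 0.0, 1.5, 3.0]
}
```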

Constructors

ReLU()

Properties

hashCode → int
The hash code for this object.
no setter; inherited
runtimeType → Type
A representation of the runtime type of the object.
no setter; inherited

Methods

call(Tensor input) → Tensor<Vector>
Applies the ReLU function element-wise to the input tensor; see the usage sketch after this list.
override
noSuchMethod(Invocation invocation) → dynamic
Invoked when a nonexistent method or property is accessed.
inherited
toString() → String
A string representation of this object.
inherited
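
A hedged usage sketch of call: the package import path and the Vector.fromList constructor below are assumptions for illustration, not the library's confirmed API.

```dart
// Usage sketch; the package path and Vector.fromList are hypothetical.
import 'package:my_ml_package/my_ml_package.dart';

void main() {
  final relu = ReLU();
  final input = Vector.fromList([-1.0, 0.0, 2.0]); // hypothetical constructor
  final output = relu(input); // invokes call(Tensor input)
  print(output); // expected: [0.0, 0.0, 2.0]
}
```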

Operators

operator ==(Object other) → bool
The equality operator.
inherited