Swish class

The Sigmoid-weighted Linear Unit (SiLU) activation function, also known as Swish.

This is a smooth, non-monotonic function that often outperforms ReLU on deep models. It is "self-gated": the sigmoid of the input is used as the gate on the input itself.

The function is defined as $f(x) = x \cdot \sigma(x)$, where $\sigma$ is the sigmoid function.
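
In plain Dart, the definition above amounts to the following minimal sketch (the sigmoid and swish helpers here are illustrative, not part of this library):

import 'dart:math' as math;

// Sigmoid: σ(x) = 1 / (1 + e^(-x)).
double sigmoid(double x) => 1 / (1 + math.exp(-x));

// Swish/SiLU: f(x) = x * σ(x).
double swish(double x) => x * sigmoid(x);

void main() {
  // Negative inputs are damped smoothly rather than clipped to zero,
  // which is what makes the function smooth and non-monotonic.
  print(swish(-1.0)); // ≈ -0.2689
  print(swish(0.0));  //   0.0
  print(swish(2.0));  // ≈  1.7616
}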

Example

Layer hiddenLayer = DenseLayer(128, activation: Swish());
Implemented types

Constructors

Swish()

Properties

hashCode → int
The hash code for this object.
no setter inherited
runtimeType → Type
A representation of the runtime type of the object.
no setter inherited

Methods

call(Tensor input) → Tensor<Vector>
Applies the Swish function element-wise to the input tensor.
override
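
Because the class defines call, an instance can be applied directly like a function. A minimal usage sketch, where preActivations is an assumed Tensor produced elsewhere in the model:

final swish = Swish();
final Tensor<Vector> activated = swish(preActivations); // same as swish.call(preActivations)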
noSuchMethod(Invocation invocation) → dynamic
Invoked when a nonexistent method or property is accessed.
inherited
toString() → String
A string representation of this object.
inherited

Operators

operator ==(Object other) → bool
The equality operator.
inherited