Sigmoid class

The Sigmoid activation function.

It squashes any real-valued input into the open interval (0, 1). The function is defined as $f(x) = \frac{1}{1 + e^{-x}}$.
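As a quick numeric sketch of the formula (plain Dart with `dart:math`, no framework assumed): large negative inputs map toward 0, zero maps to exactly 0.5, and large positive inputs map toward 1.

```dart
import 'dart:math';

// f(x) = 1 / (1 + e^(-x))
double sigmoid(double x) => 1 / (1 + exp(-x));

void main() {
  print(sigmoid(-5.0)); // ≈ 0.0067
  print(sigmoid(0.0));  // 0.5
  print(sigmoid(5.0));  // ≈ 0.9933
}
```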

Because of this property, it is the standard activation function for the output layer in binary classification problems, where its output can be interpreted as a probability (e.g., the probability that an email is spam).
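A minimal sketch of that interpretation in plain Dart (the logit value here is hypothetical, standing in for the raw score a network's final dense layer would produce): the sigmoid output is read as a probability and thresholded at 0.5 to make a decision.

```dart
import 'dart:math';

double sigmoid(double x) => 1 / (1 + exp(-x));

void main() {
  // Hypothetical raw score (logit) for one email from the final layer.
  final logit = 2.3;
  final pSpam = sigmoid(logit);      // interpreted as P(spam)
  final isSpam = pSpam >= 0.5;       // threshold the probability
  print('P(spam) = ${pSpam.toStringAsFixed(3)}, spam: $isSpam');
}
```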

While historically popular, it is rarely used in the hidden layers of modern networks because its gradient saturates for inputs far from zero, which can lead to the vanishing gradient problem. ReLU is generally preferred for hidden layers.
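The vanishing-gradient issue follows from the derivative, $f'(x) = f(x)\,(1 - f(x))$, which peaks at 0.25 at $x = 0$ and decays toward 0 as $|x|$ grows, so gradients shrink as they flow back through stacked sigmoid layers. A sketch in plain Dart (no framework assumed):

```dart
import 'dart:math';

double sigmoid(double x) => 1 / (1 + exp(-x));

// Derivative of the sigmoid: f'(x) = f(x) * (1 - f(x)).
double sigmoidGrad(double x) {
  final s = sigmoid(x);
  return s * (1 - s);
}

void main() {
  print(sigmoidGrad(0.0)); // 0.25 — the maximum possible gradient
  print(sigmoidGrad(5.0)); // ≈ 0.0066 — nearly flat; gradients vanish
  // Backpropagating through n sigmoid layers scales the gradient by
  // at most 0.25 per layer, i.e. by at most 0.25^n overall.
}
```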

Example

// A binary classification output layer.
Layer outputLayer = DenseLayer(1, activation: Sigmoid());

Constructors

Sigmoid()

Properties

hashCode → int
The hash code for this object.
no setter, inherited
runtimeType → Type
A representation of the runtime type of the object.
no setter, inherited

Methods

call(Tensor input) → Tensor<Vector>
Applies the Sigmoid function element-wise to the input tensor.
override
noSuchMethod(Invocation invocation) → dynamic
Invoked when a nonexistent method or property is accessed.
inherited
toString() → String
A string representation of this object.
inherited

Operators

operator ==(Object other) → bool
The equality operator.
inherited