DropoutLayer class
A Dropout layer for regularization.
Dropout is a regularization technique used to prevent overfitting. During
training, it randomly sets input units to 0 with a frequency of rate at
each step. This forces the network to learn more robust, redundant features.
IMPORTANT: This layer should only be active during training. During evaluation or inference, it should be disabled or bypassed, allowing all data to pass through unmodified.
This implementation uses "inverted dropout," where the outputs of the
non-dropped units are scaled up by 1 / (1 - rate). This ensures that
the expected output magnitude remains the same, and no changes are needed
at test time.
- Input: A Tensor<Vector> or Tensor<Matrix>.
- Output: A tensor of the same shape as the input.
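As an illustration of the inverted-dropout behavior described above, here is a minimal sketch of a forward pass. It is an assumption for illustration only: the function name, the plain List<double> standing in for this library's Tensor types, and the Random-based mask are not this library's actual internals.

```dart
import 'dart:math';

// Sketch of inverted dropout (not the library's real implementation).
// Each surviving unit is scaled by 1 / (1 - rate), so the expected
// output magnitude matches the input and inference needs no rescaling.
List<double> dropoutForward(
    List<double> input, double rate, bool isTraining,
    {Random? rng}) {
  // During evaluation/inference the layer is bypassed entirely.
  if (!isTraining || rate == 0.0) return List.of(input);
  final r = rng ?? Random();
  final scale = 1.0 / (1.0 - rate);
  return input
      .map((x) => r.nextDouble() < rate ? 0.0 : x * scale)
      .toList();
}
```

Because the kept activations are scaled up during training, the expected value of each output equals its input, which is why no changes are needed at test time.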
Example
SNetwork model = SNetwork([
  DenseLayer(128, activation: ReLU()),
  DropoutLayer(0.5), // Drops 50% of the inputs from the previous layer
  DenseLayer(10),
]);
Constructors
- DropoutLayer(double rate)
Properties
- hashCode → int
  The hash code for this object.
  no setter, inherited
- isTraining ↔ bool
  Whether the layer is currently in training mode; dropout is applied only while this is true.
  getter/setter pair
- name ↔ String
  A user-friendly name for the layer (e.g., 'dense', 'lstm').
  getter/setter pair, override-getter
- parameters → List<Tensor>
  A list of all trainable tensors (weights and biases) in the layer.
  no setter, override
- rate ↔ double
  The fraction of input units to drop, between 0 and 1.
  getter/setter pair
- runtimeType → Type
  A representation of the runtime type of the object.
  no setter, inherited
Methods
- build(Tensor input) → void
  Initializes the layer's parameters based on the shape of the first input.
  inherited
- call(Tensor input) → Tensor
  The public, callable interface for the layer.
  inherited
- forward(Tensor input) → Tensor<Vector>
  The core logic of the layer's transformation.
  override
- noSuchMethod(Invocation invocation) → dynamic
  Invoked when a nonexistent method or property is accessed.
  inherited
- toString() → String
  A string representation of this object.
  inherited
Operators
- operator ==(Object other) → bool
  The equality operator.
  inherited