layertypes/layer library
Classes
- AveragePooling2DLayer
- A 2D average pooling layer.
- BatchNorm1D
- A 1D Batch Normalization layer.
- BatchNorm2D
- A 2D Batch Normalization layer.
- Conv2DLayer
- A 2D convolutional layer.
- ConvLSTMLayer
- A Convolutional Long Short-Term Memory (ConvLSTM) layer.
- DenseLayer
- A standard, fully-connected neural network layer for 1D Vector data.
- DenseLayerMatrix
- A fully-connected layer that operates on a batch of data (a Matrix).
- DropoutLayer
- A Dropout layer for regularization.
- DropoutLayerMatrix
- A Dropout layer for regularizing 2D Matrix data.
- DualLSTMLayer
- A Multi-Timeline Long Short-Term Memory (MT-LSTM) layer.
- FlattenLayer
- A utility layer that flattens a multi-dimensional tensor into a 1D vector.
- GeneralizedChainedScaleLayer
- A self-contained, chained, multi-scale recurrent layer.
- GlobalAveragePoolingLayer
- A layer that reduces a sequence Matrix (seq_len, dModel) to a single Vector (dModel) by averaging across the sequence dimension.
- Layer
- The abstract base class for all neural network layers.
- LSTMLayer
- A Long Short-Term Memory (LSTM) layer.
- MaxPooling1DLayer
- A 1D max pooling layer for sequence data.
- MaxPooling2DLayer
- A 2D max pooling layer.
- ReLULayer
- An activation layer that applies the Rectified Linear Unit (ReLU) function.
- ReLULayerMatrix
- An activation layer that applies ReLU element-wise to a Matrix.
- ReshapeVectorToMatrixLayer
- A utility layer that reshapes a 1D Vector into a 2D Matrix.
- RNN
- A simple Recurrent Neural Network (RNN) layer.
- SingleHeadAttention
- Implements a single head of the self-attention mechanism.
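To make the pooling classes above concrete, here is a hedged sketch in plain Python (not this library's Dart API) of the computation a GlobalAveragePoolingLayer performs: a sequence Matrix of shape (seq_len, dModel) is averaged across the sequence dimension, yielding a single Vector of length dModel. The function name and list-of-lists representation are illustrative assumptions.

```python
def global_average_pool(matrix):
    """Average a (seq_len, dModel) matrix over the sequence dimension.

    Illustrative only: the matrix is a list of rows (time steps),
    and the result is one vector of per-feature means.
    """
    seq_len = len(matrix)
    d_model = len(matrix[0])
    return [sum(row[j] for row in matrix) / seq_len for j in range(d_model)]

# A toy sequence with seq_len = 2 and dModel = 3.
seq = [[1.0, 2.0, 3.0],
       [3.0, 4.0, 5.0]]
print(global_average_pool(seq))  # [2.0, 3.0, 4.0]
```

The same reduction underlies the globalAveragePooling function listed below, applied per Matrix in a Tensor.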
Functions
- globalAveragePooling(Tensor<Matrix> input) → Tensor<Vector>
- Computes the average of each Matrix in a Tensor across the sequence dimension, returning a Tensor<Vector>.
- setTrainingMode(SNetwork model, bool isTraining) → void
- Iterates through a model's layers and sets the isTraining flag on any DropoutLayer instances.
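The pattern setTrainingMode follows can be sketched in plain Python (again, not this library's Dart API; the class and attribute names here are illustrative assumptions): walk the model's layer list and flip the training flag on any dropout layers, leaving other layer types untouched.

```python
class DropoutLayer:
    """Stand-in for a dropout layer with a mutable training flag."""
    def __init__(self, rate):
        self.rate = rate
        self.is_training = True

class DenseLayer:
    """Stand-in for a layer with no training-mode behavior."""
    pass

def set_training_mode(layers, is_training):
    """Set is_training on every DropoutLayer in the layer list."""
    for layer in layers:
        if isinstance(layer, DropoutLayer):
            layer.is_training = is_training

model = [DenseLayer(), DropoutLayer(0.5), DenseLayer()]
set_training_mode(model, False)   # e.g. switch off dropout before inference
print(model[1].is_training)       # False
```

Centralizing the flag flip in one helper avoids forgetting a dropout layer when toggling between training and inference.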