optimizers/optimizers library

Classes

Adagrad
Implements the Adagrad optimizer.
Adam
Implements the Adam optimizer.
AdamMatrix
Implements the Adam optimizer for 2D Matrix parameters.
AdamW
Implements the AdamW optimizer.
AMSGrad
Implements the AMSGrad optimizer.
Momentum
Implements Stochastic Gradient Descent (SGD) with Momentum.
NAG
Implements the Nesterov Accelerated Gradient (NAG) optimizer.
Optimizer
The abstract base class for all optimization algorithms (see the sketch after this list).
RMSprop
Implements the RMSprop optimizer.
SGD
Implements the Stochastic Gradient Descent (SGD) optimizer.