Optimizer module.
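In a typical training loop, you build a configuration struct, call init() to obtain an optimizer, and apply it with Optimizer::step, which consumes the module and returns the updated one. Below is a minimal sketch of one training step, assuming the usual Burn training-loop API (AdamConfig::new().init(), GradientsParams::from_grads, Optimizer::step); the Model struct is a placeholder:

```rust
use burn::module::Module;
use burn::nn::Linear;
use burn::optim::{GradientsParams, Optimizer};
use burn::tensor::backend::{AutodiffBackend, Backend};
use burn::tensor::Tensor;

// A one-layer placeholder model, just enough to drive the optimizer.
#[derive(Module, Debug)]
struct Model<B: Backend> {
    linear: Linear<B>,
}

// One training step: forward, backward, collect gradients, apply them.
// `optim` is anything implementing `Optimizer`, e.g. the value returned
// by `AdamConfig::new().init()`.
fn train_step<B: AutodiffBackend>(
    mut model: Model<B>,
    optim: &mut impl Optimizer<Model<B>, B>,
    input: Tensor<B, 2>,
    lr: f64,
) -> Model<B> {
    // Forward pass with a toy scalar loss.
    let loss = model.linear.forward(input).mean();

    // Backward pass, then collect the gradients that belong to this
    // model's parameters.
    let grads = loss.backward();
    let grads = GradientsParams::from_grads(grads, &model);

    // `step` consumes the module and returns the updated one.
    model = optim.step(lr, model, grads);
    model
}
```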
Modules§
- Adaptor module for optimizers.
- Weight decay module for optimizers.
- Momentum module for optimizers.
- Record module for optimizers.
Structs§
- AdaGrad optimizer
- AdaGrad configuration.
- AdaGrad state.
- The record item type for the module.
- Adam optimizer as described in the paper Adam: A Method for Stochastic Optimization.
- Adam configuration.
- Adam state.
- The record item type for the module.
- AdamW optimizer as described in the paper Decoupled Weight Decay Regularization, Loshchilov and Hutter, 2019.
- AdamW configuration.
- AdamW state.
- The record item type for the module.
- Adaptive momentum state.
- The record item type for the module.
- Adaptive momentum state.
- The record item type for the module.
- CenteredState stores and passes optimizer step parameters.
- The record item type for the module.
- Accumulate gradients into a single Gradients object; see the accumulation sketch after this list.
- Data type that contains gradients for parameters.
- Learning rate decay state (also includes sum state).
- The record item type for the module.
- Optimizer that implements the RmsProp algorithm. The optimizer can be configured with RmsPropConfig.
- Configuration to create the RmsProp optimizer.
- RmsPropMomentum stores the optimizer's momentum configuration, which is held in the optimizer itself rather than passed in during the step() calculation.
- RmsPropMomentumState stores and passes optimizer step parameters.
- The record item type for the module.
- State of RmsProp.
- The record item type for the module.
- Optimizer that implements stochastic gradient descent with momentum.
- Configuration to create the Sgd optimizer.
- State of Sgd.
- The record item type for the module.
- SquareAvgState stores and passes optimizer step parameters.
- The record item type for the module.
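Gradients can also be accumulated across several backward passes before a single optimizer step. A minimal sketch, assuming GradientsAccumulator exposes new(), accumulate(), and grads(); the compute_grads closure is a hypothetical stand-in for a forward/backward pass over one micro-batch:

```rust
use burn::module::AutodiffModule;
use burn::optim::{GradientsAccumulator, GradientsParams, Optimizer};
use burn::tensor::backend::AutodiffBackend;

// Sum gradients over `micro_batches` passes, then apply one step.
fn accumulated_step<B, M, O>(
    mut model: M,
    optim: &mut O,
    lr: f64,
    micro_batches: usize,
    mut compute_grads: impl FnMut(&M) -> GradientsParams,
) -> M
where
    B: AutodiffBackend,
    M: AutodiffModule<B>,
    O: Optimizer<M, B>,
{
    let mut accumulator = GradientsAccumulator::new();
    for _ in 0..micro_batches {
        // Gradients from one micro-batch are added into the accumulator.
        accumulator.accumulate(&model, compute_grads(&model));
    }
    // Hand the summed gradients to a single optimizer step.
    let grads = accumulator.grads();
    model = optim.step(lr, model, grads);
    model
}
```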
Traits§
- General trait to optimize a module.
- SimpleOptimizer is an opinionated trait that simplifies implementing an optimizer; see the tensor-wise sketch after this list.
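To illustrate the tensor-wise shape that SimpleOptimizer encourages, here is a stand-in trait defined locally. TensorWiseStep and PlainSgd are hypothetical names used for illustration only, not Burn's actual definitions, which additionally handle state records and device movement before the adaptor module lifts them to the general Optimizer trait:

```rust
use burn::tensor::backend::Backend;
use burn::tensor::Tensor;

/// A hypothetical, trimmed-down tensor-wise optimizer trait, defined
/// locally for illustration (not Burn's actual `SimpleOptimizer`).
trait TensorWiseStep<B: Backend> {
    /// State carried between steps for one parameter tensor, if any.
    type State<const D: usize>: Clone;

    /// Update a single parameter tensor from its gradient.
    fn step<const D: usize>(
        &self,
        lr: f64,
        tensor: Tensor<B, D>,
        grad: Tensor<B, D>,
        state: Option<Self::State<D>>,
    ) -> (Tensor<B, D>, Option<Self::State<D>>);
}

/// Plain gradient descent: w <- w - lr * grad, with no state.
#[derive(Clone)]
struct PlainSgd;

impl<B: Backend> TensorWiseStep<B> for PlainSgd {
    type State<const D: usize> = ();

    fn step<const D: usize>(
        &self,
        lr: f64,
        tensor: Tensor<B, D>,
        grad: Tensor<B, D>,
        _state: Option<Self::State<D>>,
    ) -> (Tensor<B, D>, Option<Self::State<D>>) {
        (tensor - grad.mul_scalar(lr), None)
    }
}
```

Writing the update per parameter tensor keeps the optimizer itself free of module traversal; wrapping such a type behind the general Optimizer trait is the job of the adaptor module listed above.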