Known values of StochasticOptimizer that the service accepts.
Adam is an algorithm that optimizes stochastic objective functions based on adaptive estimates of moments.
AdamW is a variant of the optimizer Adam that has an improved implementation of weight decay.
No optimizer selected.
Stochastic Gradient Descent optimizer.
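A minimal usage sketch in TypeScript follows, assuming the common "known values" SDK pattern: the property is typed as a plain string so the service can accept values added later, while the strings it currently recognizes are the ones described above. The member names and string values ("None", "Sgd", "Adam", "Adamw"), the StochasticOptimizerValues constant, and the TrainingSettings interface are illustrative assumptions, not confirmed exports of this package.

```ts
// Sketch only: local mirror of the assumed known values for StochasticOptimizer.
// The real package may export these under a different name; check its API surface.
const StochasticOptimizerValues = {
  None: "None",   // No optimizer selected.
  Sgd: "Sgd",     // Stochastic Gradient Descent optimizer.
  Adam: "Adam",   // Adaptive moment estimation.
  Adamw: "Adamw", // Adam variant with improved weight decay.
} as const;

// Extensible string type: unknown future values remain assignable.
type StochasticOptimizer = string;

// Hypothetical settings object that carries the optimizer choice.
interface TrainingSettings {
  stochasticOptimizer?: StochasticOptimizer;
}

// Pass one of the known values; any other string would also type-check,
// but only the values above are expected to be accepted by the service.
const settings: TrainingSettings = {
  stochasticOptimizer: StochasticOptimizerValues.Adamw,
};

console.log(settings.stochasticOptimizer); // "Adamw"
```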