@@ -185,7 +185,7 @@ class LazyAdam(Optimizer):

    weight_decay (Union[float, int]): Weight decay (L2 penalty). Default: 0.0.
    loss_scale (float): A floating point value for the loss scale. Should be equal to or greater than 1. In general,
        use the default value. Only when `FixedLossScaleManager` is used for training and the `drop_overflow_update`
        in `FixedLossScaleManager` is set to False does this value need to be the same as the `loss_scale` in
        `FixedLossScaleManager`. Refer to class :class:`mindspore.FixedLossScaleManager` for more details.
        Default: 1.0.
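The matching requirement in the hunk above comes from how fixed loss scaling works: the loss is multiplied by the scale before backpropagation, so every gradient the optimizer sees is scaled by the same factor, and the optimizer divides by its own `loss_scale` to recover the true gradient. The two values cancel only if they agree. A minimal pure-Python sketch of that arithmetic (illustrative only; the function names are hypothetical and this is not MindSpore internals):

```python
# Illustrative sketch of fixed loss scaling; not MindSpore code.
# The trainer multiplies the loss by `loss_scale` before backprop, so
# each gradient reaches the optimizer scaled by the same factor. The
# optimizer must divide by an identical `loss_scale` to undo it.

LOSS_SCALE = 1024.0  # must match between the loss manager and the optimizer


def scaled_gradient(true_grad, loss_scale=LOSS_SCALE):
    """Gradient produced by backprop after the loss is scaled."""
    return true_grad * loss_scale


def optimizer_unscale(grad, loss_scale=LOSS_SCALE):
    """What the optimizer does before applying its update rule."""
    return grad / loss_scale


g = scaled_gradient(0.5)  # gradient as seen by the optimizer: 512.0
recovered = optimizer_unscale(g)
print(recovered)  # recovers 0.5 only because the two scales match
```

If the optimizer's `loss_scale` differed from the manager's, the update would be silently off by the ratio of the two scales, which is why the docstring requires them to be equal when `drop_overflow_update` is False.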