
delete extra whitespace in comments of lazyadam

pull/15061/head
wangnan39@huawei.com 4 years ago
parent commit adf24f9ba0
1 changed file with 1 addition and 1 deletion:
  1. mindspore/nn/optim/lazyadam.py  (+1, -1)

mindspore/nn/optim/lazyadam.py  (+1, -1)

@@ -185,7 +185,7 @@ class LazyAdam(Optimizer):
weight_decay (Union[float, int]): Weight decay (L2 penalty). Default: 0.0.
loss_scale (float): A floating point value for the loss scale. Should be equal to or greater than 1. In general,
    use the default value. Only when `FixedLossScaleManager` is used for training and the `drop_overflow_update`
-   in `FixedLossScaleManager` is set to False, then this value needs to be the same as the `loss_scale` in
+   in `FixedLossScaleManager` is set to False, then this value needs to be the same as the `loss_scale` in
    `FixedLossScaleManager`. Refer to class :class:`mindspore.FixedLossScaleManager` for more details.
    Default: 1.0.
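
For context, the docstring lines touched here describe how the optimizer's `loss_scale` interacts with `FixedLossScaleManager`. The sketch below illustrates that constraint; it is not part of this commit, and the network, loss function, and scale value are placeholder assumptions.

# Minimal sketch (assumed example, not part of this patch): when
# FixedLossScaleManager is created with drop_overflow_update=False,
# the optimizer's loss_scale should equal the manager's loss_scale.
import mindspore.nn as nn
from mindspore import Model, FixedLossScaleManager

net = nn.Dense(16, 8)      # placeholder network
loss_fn = nn.MSELoss()     # placeholder loss function

scale = 1024.0             # assumed loss scale value
manager = FixedLossScaleManager(loss_scale=scale, drop_overflow_update=False)
optimizer = nn.LazyAdam(net.trainable_params(), learning_rate=1e-3, loss_scale=scale)

# The same scale is passed to both objects, as the docstring requires.
model = Model(net, loss_fn=loss_fn, optimizer=optimizer, loss_scale_manager=manager)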



