
!2866 modify the loss scale annotation

Merge pull request !2866 from lilei/modify_loss_scale_annotation
tags/v0.6.0-beta
mindspore-ci-bot 5 years ago
commit f245b22133
1 changed file with 1 addition and 1 deletion:
  1. mindspore/nn/optim/proximal_ada_grad.py (+1, -1)

mindspore/nn/optim/proximal_ada_grad.py (+1, -1)

@@ -71,7 +71,7 @@ class ProximalAdagrad(Optimizer):
     l1 (float): l1 regularization strength, must be greater than or equal to zero. Default: 0.0.
     l2 (float): l2 regularization strength, must be greater than or equal to zero. Default: 0.0.
     use_locking (bool): If True use locks for update operation. Default: False.
-    loss_scale (float): Value for the loss scale. It should be equal to or greater than 1.0. Default: 1.0.
+    loss_scale (float): Value for the loss scale. It should be greater than 0.0. Default: 1.0.
     weight_decay (float): Weight decay value to multiply weight, must be zero or positive value. Default: 0.0.
     Inputs:
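
For reference, here is a minimal usage sketch of what the relaxed constraint permits. It assumes the mindspore.nn.ProximalAdagrad constructor of this era (params, accum, learning_rate, l1, l2, use_locking, loss_scale, weight_decay); the network and the concrete loss_scale value are illustrative, not taken from this commit:

    import mindspore.nn as nn

    # Hypothetical single-layer network, only to have parameters to optimize.
    net = nn.Dense(10, 1)

    # Under the corrected annotation, loss_scale need only be strictly
    # greater than 0.0, so a fractional scale such as 0.5 is now a
    # documented value (the old text required it to be >= 1.0).
    opt = nn.ProximalAdagrad(net.trainable_params(),
                             accum=0.1,
                             learning_rate=0.001,
                             l1=0.0,
                             l2=0.0,
                             use_locking=False,
                             loss_scale=0.5,
                             weight_decay=0.0)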

