
!6593 modify lr comment

Merge pull request !6593 from lijiaqi/lr_api
tags/v1.0.0
mindspore-ci-bot (Gitee) committed 5 years ago
commit abb2134f55
1 changed file with 6 additions and 0 deletions
  1. mindspore/nn/learning_rate_schedule.py (+6, -0)

mindspore/nn/learning_rate_schedule.py (+6, -0)

@@ -62,10 +62,12 @@ class ExponentialDecayLR(LearningRateSchedule):
decayed\_learning\_rate[i] = learning\_rate * decay\_rate^{p}

Where:

.. math::
    p = \frac{current\_step}{decay\_steps}

If `is_stair` is True, the formula is:

.. math::
    p = floor(\frac{current\_step}{decay\_steps})
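
For reference, a minimal plain-Python sketch of the decay rule this docstring describes; `exponential_decay_lr` and its arguments are illustrative names, not the MindSpore API (the class above wraps the same arithmetic in a `LearningRateSchedule` cell):

```python
import math

def exponential_decay_lr(learning_rate, decay_rate, decay_steps, current_step, is_stair=False):
    """Illustrative helper mirroring the ExponentialDecayLR docstring formula."""
    p = current_step / decay_steps
    if is_stair:
        p = math.floor(p)  # staircase decay: p is truncated, so the LR drops in steps
    return learning_rate * decay_rate ** p

# lr=0.1, decay_rate=0.9, decay_steps=4, step 2:
print(exponential_decay_lr(0.1, 0.9, 4, 2))                 # 0.1 * 0.9**0.5 ~= 0.0949
print(exponential_decay_lr(0.1, 0.9, 4, 2, is_stair=True))  # 0.1 * 0.9**0   =  0.1
```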

@@ -116,10 +118,12 @@ class NaturalExpDecayLR(LearningRateSchedule):
decayed\_learning\_rate[i] = learning\_rate * e^{-decay\_rate * p}

Where:

.. math::
    p = \frac{current\_step}{decay\_steps}

If `is_stair` is True, the formula is:

.. math::
    p = floor(\frac{current\_step}{decay\_steps})
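
Likewise, a plain-Python sketch of the natural-exponential rule; `natural_exp_decay_lr` is an illustrative name, not the MindSpore API:

```python
import math

def natural_exp_decay_lr(learning_rate, decay_rate, decay_steps, current_step, is_stair=False):
    """Illustrative helper mirroring the NaturalExpDecayLR docstring formula."""
    p = current_step / decay_steps
    if is_stair:
        p = math.floor(p)  # staircase decay: p is truncated to an integer
    return learning_rate * math.exp(-decay_rate * p)

# lr=0.1, decay_rate=2.0, decay_steps=4, step 2:
print(natural_exp_decay_lr(0.1, 2.0, 4, 2))  # 0.1 * e**(-2.0 * 0.5) ~= 0.0368
```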

@@ -171,10 +175,12 @@ class InverseDecayLR(LearningRateSchedule):
decayed\_learning\_rate[i] = learning\_rate / (1 + decay\_rate * p)

Where:

.. math::
    p = \frac{current\_step}{decay\_steps}

If `is_stair` is True, the formula is:

.. math::
    p = floor(\frac{current\_step}{decay\_steps})
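
And a plain-Python sketch of the inverse-time rule; `inverse_decay_lr` is an illustrative name, not the MindSpore API:

```python
import math

def inverse_decay_lr(learning_rate, decay_rate, decay_steps, current_step, is_stair=False):
    """Illustrative helper mirroring the InverseDecayLR docstring formula."""
    p = current_step / decay_steps
    if is_stair:
        p = math.floor(p)  # staircase decay: p is truncated to an integer
    return learning_rate / (1 + decay_rate * p)

# lr=0.1, decay_rate=0.5, decay_steps=4, step 2:
print(inverse_decay_lr(0.1, 0.5, 4, 2))  # 0.1 / (1 + 0.5 * 0.5) = 0.08
```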


