author | commit | message | when
wangnan39@huawei.com | ddc558fd72 | fix weight decay error in optimizer AdamWeightDecay | 5 years ago
mindspore-ci-bot | c52466d9bd | !691 fix bug of Acosh, TopK, ResizeNearestNeighbor, DepthwiseConv2dNative. (Merge pull request !691 from zhangbuxue/fix_bug_of_ops) | 5 years ago
buxue | 9cb71441ea | fix bugs of Acosh, TopK, ResizeNearestNeighbor, DepthwiseConv2dNative | 5 years ago
liuxiao | 5e877a7715 | modify api and add example | 5 years ago
jinyaohui | 219bc18439 | clean pylint | 5 years ago
liubuyu | 672244e0ac | add keep_bn_fp32 parameter | 5 years ago
leilei_snow | 9e2ec3b8d8 | check the legal value of weight_decay and loss_scale | 5 years ago
mindspore-ci-bot | 31a12009dd | !418 support parameter update for vm (Merge pull request !418 from wangnan39/support_parameter_update_for_vm) | 6 years ago
wangnan39@huawei.com | b812b18c02 | support update parameter for vm | 6 years ago
leilei_snow | 9b28d9bd4a | Add comment about int type. | 6 years ago
mindspore-ci-bot | c3ec9712e0 | !403 add cell class name to error message (Merge pull request !403 from fary86/add_cell_name_to_error_message_for_nn_layer) | 6 years ago
Ziyan | f182edfd44 | fix lars base class type | 6 years ago
fary86 | 8cbbbd950e | Add cell name to error message | 6 years ago
zhangz0911gm | 4ba6f7884d | Fixing problem issues including class slice example cannot run, adding an example for class SigmoidCrossEntropyWithLogits etc. | 6 years ago
leilei_snow | c4d0bb266a | fix optimizer.decay_weight bug | 6 years ago
root | 7d700295f8 | add dynamic lr and enhance optim | 6 years ago
buxue | 149839952b | normalize log in optimizer in python | 6 years ago
seatea | ead50a2170 | Define the default decay_filter for `Adam` optimizer. | 6 years ago
zhongligeng | 144a636b51 | resolve some issue in nn comments | 6 years ago
zhaoting | 1b4041a8f1 | add weight decay in RMSProp optimizer | 6 years ago
zhaoting | ed3c2d7229 | add RMSProp optimizer | 6 years ago
mindspore-ci-bot | 352c6faf85 | !18 enable use float type learning rate in lars optimizer (Merge pull request !18 from gziyan/master) | 6 years ago
Ziyan | 4cbcd8e907 | enable use float type learning rate in lars optimizer | 6 years ago
seatea | 6c03542eec | Fix dtype bug for loss_scale and weight_decay: 1. change dtype of scale to dtype of grad in loss_scale.py; 2. change dtype of weight_decay to dtype of weight in optimizer.py | 6 years ago
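The dtype fix in commit 6c03542eec can be illustrated with a minimal sketch. This is a hypothetical NumPy example, not the actual MindSpore code: the function name `apply_weight_decay` and its signature are assumptions. The point of the fix is that the decay factor is cast to the dtype of the weight before the update, so a float64 Python scalar does not silently promote a float16 parameter update.

```python
import numpy as np

def apply_weight_decay(weight, grad, weight_decay):
    """Add L2 weight decay to a gradient, matching the weight's dtype.

    Hypothetical sketch: `weight_decay` is cast to weight.dtype
    (e.g. float16) so the decayed gradient keeps the weight's precision
    instead of being promoted to float64 by the Python scalar.
    """
    decay = np.asarray(weight_decay, dtype=weight.dtype)  # cast, per the fix
    return grad + decay * weight

# Usage: a float16 parameter keeps float16 through the decay step.
w = np.ones(3, dtype=np.float16)
g = np.zeros(3, dtype=np.float16)
out = apply_weight_decay(w, g, 0.01)
```

Without the explicit cast, `0.01 * w` would be computed in a wider dtype and the result would no longer match the parameter's storage type.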
zhunaipan | 930a1fb0a8 | initial version (Signed-off-by: leonwanghui <leon.wanghui@huawei.com>) | 6 years ago