69 Commits (bc738d4ec3e73f4e25a3631f71b9833fe6830e47)

| Author | SHA1 | Message | Date |
|--------|------|---------|------|
| guohongzilong | e70b2f5430 | add optimizer.get_lr_parameter() method | 6 years ago |
| guohongzilong | c95215bca0 | separate lr groups and weight_decay groups | 6 years ago |
| guohongzilong | 0b8cea8018 | learning rate and weight decay support group mode | 6 years ago |
| guohongzilong | 824bc30a94 | learning rate and weight decay support group mode | 6 years ago |
| wangnan39@huawei.com | 7f602016f4 | add parameter verification for rmsprop, and modify default value in annotation | 6 years ago |
| liubuyu | 672244e0ac | add keep_bn_fp32 parameter | 6 years ago |
| leilei_snow | 9e2ec3b8d8 | check the legal value of weight_decay and loss_scale | 6 years ago |
| mindspore-ci-bot | 31a12009dd | !418 support parameter update for vm | 6 years ago |
| wangnan39@huawei.com | b812b18c02 | support parameter update for vm | 6 years ago |
| leilei_snow | 9b28d9bd4a | Add comment about int type. | 6 years ago |
| mindspore-ci-bot | c3ec9712e0 | !403 add cell class name to error message | 6 years ago |
| Ziyan | f182edfd44 | fix lars base class type | 6 years ago |
| fary86 | 8cbbbd950e | Add cell name to error message | 6 years ago |
| zhangz0911gm | 4ba6f7884d | Fix issues, including the class Slice example not running; add an example for class SigmoidCrossEntropyWithLogits, etc. | 6 years ago |
| leilei_snow | c4d0bb266a | fix optimizer.decay_weight bug | 6 years ago |
| root | 7d700295f8 | add dynamic lr and enhance optim | 6 years ago |
| buxue | 149839952b | normalize log in optimizer in python | 6 years ago |
| seatea | 6c03542eec | Fix dtype bug for loss_scale and weight_decay. | 6 years ago |
| zhunaipan | 930a1fb0a8 | initial version | 6 years ago |
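Several commits above refer to "group mode", where learning rate and weight decay can be set per parameter group rather than optimizer-wide, plus validation of those values. As a rough, hypothetical sketch of that idea (the function name and dict keys here are illustrative, not MindSpore's actual implementation):

```python
# Hypothetical sketch of per-group hyperparameter resolution: each
# parameter group may override the optimizer-wide lr or weight_decay,
# and illegal (negative) values are rejected, echoing the
# "check the legal value of weight_decay and loss_scale" commit.

def resolve_group_hypers(groups, default_lr, default_weight_decay):
    """Return a list of (params, lr, weight_decay), filling in defaults
    for any group that does not override them."""
    resolved = []
    for group in groups:
        lr = group.get("lr", default_lr)
        wd = group.get("weight_decay", default_weight_decay)
        if lr < 0 or wd < 0:
            raise ValueError("lr and weight_decay must be non-negative")
        resolved.append((group["params"], lr, wd))
    return resolved


groups = [
    {"params": ["conv.weight"], "lr": 0.01},        # overrides lr only
    {"params": ["fc.weight"], "weight_decay": 0.0}, # overrides decay only
]
print(resolve_group_hypers(groups, 0.1, 1e-4))
```

Keeping lr groups and weight_decay groups independent (as the "separate lr groups and weight_decay groups" commit suggests) lets a group override one hyperparameter while inheriting the default for the other.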