96 Commits (c94dea6a512eddb6cbe8b591268d82d7b9aa3209)

| Author | SHA1 | Message | Date |
| --- | --- | --- | --- |
| mindspore-ci-bot | d04f3b9a49 | !2748 Change order param only equal to group param | 5 years ago |
| guohongzilong | 652093642e | change order param same as group params | 5 years ago |
| wangnan39@huawei.com | 68bd5cf6a1 | add cpu sparse optimizer ops with no return | 5 years ago |
| wangnan39@huawei.com | 172728a6a6 | support weight decay for sparse optimizer | 5 years ago |
| lilei | 12e330cd20 | fix group param not list | 5 years ago |
| mindspore-ci-bot | ca1d2436b8 | !2286 enable optimizer parallel with broadcast | 5 years ago |
| mindspore-ci-bot | 4c4586ea6f | !2399 fix param KeyError in group params | 5 years ago |
| Ziyan | 0925e35252 | enable optimizer parallel with broadcast | 5 years ago |
| guohongzilong | 90639a2a44 | fix params KeyError in group params | 5 years ago |
| mindspore-ci-bot | 0478b7d191 | !2303 optimize LARS interface | 5 years ago |
| Ziyan | 41ddc153a6 | modify lars interface | 5 years ago |
| liangzelang | 76ba5a643f | fix bug of type cast | 5 years ago |
| gong chen | a6dfa281ea | Init GraphKernel. | 5 years ago |
| mindspore-ci-bot | 60cd188ab8 | !2381 fix some type cast bug | 5 years ago |
| liangzelang | bbfab3ed7c | fix some type cast bug | 5 years ago |
| panyifeng | 3c2057297e | support multi param for tuple grad | 5 years ago |
| lilei | 497067d7b2 | add sparse proximal ada grad optimizer | 5 years ago |
| guohongzilong | 1702bdfc21 | change multitpefungraph to internal interface | 5 years ago |
| liuxiao | df63a3195d | fix input value check for SparseApplyFtrl and SparseApplyAdagrad | 5 years ago |
| mindspore-ci-bot | 2d84011504 | !2071 optimizer support loss scale for sparse situation | 5 years ago |
| mindspore-ci-bot | 5aeba82af3 | !2112 add warmup_steps param check in AdamWeightDecayDynamicLR optimizer | 5 years ago |
| yoonlee666 | 799f24b2d1 | add warmup_steps param check | 5 years ago |
| mindspore-ci-bot | 000e3672ba | !2110 update proximal_ada_grad optimizer learning_rate | 5 years ago |
| wangnan39@huawei.com | d4e3d69f37 | support loss scale for sparse situation | 5 years ago |
| lilei | 7e4bdf6add | proximal_ada_grad optimizer | 5 years ago |
| mindspore-ci-bot | 3536185f5b | !2007 add lazy adam optimizer and support sparse adam&ftrl for cpu backend | 5 years ago |
| wangnan39@huawei.com | 4042f16ce4 | add lazy adam optim and support sparse adam & ftrl for cpu backend | 5 years ago |
| liuxiao | 52790b74e6 | Add some description to API about optimizer. | 5 years ago |
| lilei | 36d9e353a5 | add proximal_ada_grad optimizer | 5 years ago |
| mindspore-ci-bot | 5499161531 | !1862 fixed validator for ApplyRMSProp,CumProd, CumSum,ReduceProd etc | 5 years ago |
| mindspore-ci-bot | 10fd781b15 | !1831 Add order parameter function in group params | 5 years ago |
| mindspore-ci-bot | 994b1d83c9 | !1890 Fix some bugs for issue. | 5 years ago |
| mindspore-ci-bot | eaaf824f18 | !1896 fix lars weight decay computation error | 5 years ago |
| jiangjinsheng | 51affc2f1b | fixed validator for CumProd, ReduceProd, ApplyRMSProp | 5 years ago |
| Ziyan | 94b78fdf1b | fix_lars_computation_error | 5 years ago |
| liuxiao | 26c231b734 | fix some bugs for issues. | 5 years ago |
| guohongzilong | 85a06b00c6 | add order function in group params | 5 years ago |
| mindspore-ci-bot | 1c640face9 | !1826 fix bug when check learning_rate in AdamWeightDecayDynamicLR | 5 years ago |
| jiangjinsheng | eb4571a67f | fixed LeakyReLU, Optimizer | 5 years ago |
| wangnan39@huawei.com | c9b7d95c2c | fix lr check bug in AdamWeightDecayDynamicLR | 5 years ago |
| shibeiji | 178952afbc | modify adam optimizer and script of bert to match the patterns of fusion rule | 5 years ago |
| mindspore-ci-bot | 0426ed057b | !1777 Remove ZerosLikeTensor and sub with ZerosTensor | 5 years ago |
| BowenK | 96379faa3a | Remove ZerosLikeTensor and sub with ZerosLike | 5 years ago |
| yoonlee666 | 994b16c4cc | adjust warmup_steps in AdamWeightDecayDynamicLR | 5 years ago |
| mindspore-ci-bot | e8b14d70c7 | !1542 [pynative] fix `MultitypeFuncGraph` and `HyperMap` in pynative mode | 5 years ago |
| Wei Luning | ebf02dd528 | fix `MultitypeFuncGraph` and `HyperMap` in pynative mode | 5 years ago |
| mindspore-ci-bot | 2a6a3e012c | !1555 fix bug in lamb warmup step check | 5 years ago |
| wangnan39@huawei.com | 810ccf80d8 | fix_bug_in_check_lamb_warmup_step | 5 years ago |
| chenhaozhe | f65913d62a | fix performance of bert | 5 years ago |
| guohongzilong | 8585b55a65 | add check for group parameters | 5 years ago |