39 Commits (b56fc0c2af8a3dc8729237b5e1e6e4e4e5d45dfa)

| Author | SHA1 | Message | Date |
| --- | --- | --- | --- |
| He Wei | 7d9a783993 | [auto-monad] Support side-effects by auto-monad | 5 years ago |
| shibeiji | b2d98d2751 | add tbe fusion operators LambApplyOptimizerAssign and LambApplyWeightAssign for lamb optimizer | 5 years ago |
| dayschan | a661c3dd40 | fix warning for enable_graph_kernel context in CPU device | 5 years ago |
| lihongkang | d499135f43 | fix bugs | 5 years ago |
| lihongkang | c27d27c46e | fix bugs | 5 years ago |
| looop5 | 4d8205cd93 | Delete unused interface in graph_kernels.py | 5 years ago |
| wangnan39@huawei.com | 9b545e4982 | fix bug in example of Lamb | 5 years ago |
| JunYuLiu | 1eaa4a30dd | Add labels to python files | 5 years ago |
| zhangz0911gm | 3c8ca81ebe | code_docs_updating_example_notes for nn folder | 5 years ago |
| Ziyan | 17f2e6e756 | fix api error and get next info | 5 years ago |
| mindspore-ci-bot | c967bf6846 | !7339 fix for se-resnet50 accurancy | 5 years ago |
| panfengfeng | 2d7b93e958 | fix nn & operations api comments | 5 years ago |
| mindspore-ci-bot | 7152fe04be | !5783 GraphKernel supports GPU | 5 years ago |
| dayschan | 37a48f6aac | GraphKernel supports GPU | 5 years ago |
| fary86 | 759748dc06 | Fix bugs of adam and lamb optimizer | 5 years ago |
| simson | 7cc48a9af8 | Third round of enhancement of API comment & README_CN | 5 years ago |
| simson | 3617121ccf | revert modification of opt | 5 years ago |
| Ziyan | 98e2ee90de | fix optimizer parallel problems | 5 years ago |
| wangnan39@huawei.com | 2b182633e9 | delete annotation of decay filter in optimizers | 5 years ago |
| wangnan39@huawei.com | 082433183d | uniform learning_rate behavior of optimizers | 5 years ago |
| simson | a1f789640b | fix doc error of optim API | 5 years ago |
| Ziyan | 0925e35252 | enable optimizer parallel with broadcast | 6 years ago |
| gong chen | a6dfa281ea | Init GraphKernel. | 6 years ago |
| guohongzilong | 1702bdfc21 | change multitpefungraph to internal interface | 5 years ago |
| liuxiao | 52790b74e6 | Add some description to API about optimizer. | 5 years ago |
| wangnan39@huawei.com | c9b7d95c2c | fix lr check bug in AdamWeightDecayDynamicLR | 6 years ago |
| BowenK | 96379faa3a | Remove ZerosLikeTensor and sub with ZerosLike | 6 years ago |
| mindspore-ci-bot | 2a6a3e012c | !1555 fix bug in lamb warmup step check | 6 years ago |
| wangnan39@huawei.com | 810ccf80d8 | fix_bug_in_check_lamb_warmup_step | 6 years ago |
| chenhaozhe | f65913d62a | fix performance of bert | 6 years ago |
| mindspore-ci-bot | deae380969 | !637 Learning rate and weight decay making group params | 6 years ago |
| guohongzilong | 824bc30a94 | learning rate and weight decay support group mode | 6 years ago |
| simson | bd2fd31ab3 | revert limitation of end_learning_rate | 6 years ago |
| Wei Luning | 157710ca0f | bugfix: fix bug in output tuple of tuple; check kRWWrite input no-variable; input x of ScatterNdUpdate should be a parameter node | 6 years ago |
| jinyaohui | 73642ef3d3 | clean pylint | 6 years ago |
| wangnan39@huawei.com | ddc558fd72 | fix weight decay error in optimizer AdamWeightDecay | 6 years ago |
| fary86 | 8cbbbd950e | Add cell name to error message | 6 years ago |
| zhongligeng | 144a636b51 | resolve some issue in nn comments | 6 years ago |
| zhunaipan | 930a1fb0a8 | initial version | 6 years ago |