25 Commits (8ee4d8e92d02ff255ab42d521075b974adb105a0)

Author SHA1 Message Date
  Wei Luning d4d6457ea7 init parameter data by default; only keep no data as MetaTensor in auto parallel mode 5 years ago
  zhousiyi 7d31deb6fa remove loss_scale range check to make FP32Imm(inf) comparison equal 5 years ago
  BowenK 6d4c07c886 Update python pattern expression 5 years ago
  wangnan39@huawei.com 082433183d uniform learning_rate behavior of optimizers 5 years ago
  changzherui f4cb445ea8 syn code for 0715 5 years ago
  BowenK f267a105b8 Add Python Pass UT 5 years ago
  changzherui 17da929b82 syn code 0706 5 years ago
  guohongzilong 652093642e change order param same as group params 5 years ago
  jinyaohui dd5fba1db9 add notice 5 years ago
  yanghaoran 21c381b366 sync from mindspore to incubator 5 years ago
  fary86 15b3fba0ef Fix eliminate get ref parameter 5 years ago
  mindspore-ci-bot 10fd781b15 !1831 Add order parameter function in group params 5 years ago
  guohongzilong 85a06b00c6 add order function in group params 5 years ago
  jinyaohui 5e43edc474 clean pylint 5 years ago
  jonyguo 228061818c Merge branch 'incubator-master' into sync_05177ff9_6b1715a7 5 years ago
  jinyaohui 86d197dfeb clean pylint 5 years ago
  jonyguo 78cc0d5d8d Merge branch 'incubator-master' into sync_d9c74e0a 5 years ago
  buxue e490618db8 support tensor get value by tensor index 5 years ago
  mindspore-ci-bot a2a3b1c6c5 !1089 Add Optimizer method get_lr_parameter 5 years ago
  guohongzilong e70b2f5430 add optimizer.get_lr_parameter() method 5 years ago
  jinyaohui 26fd75895d pylint waring clean 5 years ago
  guohongzilong c95215bca0 seperate lr groups and weight_decay groups 5 years ago
  guohongzilong 0b8cea8018 learning rate and weight decay support group mode 5 years ago
  guohongzilong 824bc30a94 learning rate and weight decay support group mode 5 years ago
  zhunaipan 930a1fb0a8 initial version 5 years ago