29 Commits (5cccfbc61ba4e67de63eecacd564373b7ddb0e3a)

| Author | SHA1 | Message | Date |
|---|---|---|---|
| Yi Huaijie | 2eb739de6e | change HostAllGather and HostReduceScatter to internal interface | 5 years ago |
| Shida He | 4c056855e0 | Implementation for mindspore debugger | 5 years ago |
| Xiaoda Zhang | 3ff6e336c6 | check cast from optimizer in auto-parallel | 5 years ago |
| mindspore-ci-bot | a663f2066c | !2285 [Code Review] code review fix | 5 years ago |
| lichenever | 563622874a | update | 5 years ago |
| jjfeing | c26274f324 | fix code review bug | 5 years ago |
| jjfeing | caab25e09b | tbe select broadcast reduce dynamic | 5 years ago |
| lichenever | e0e055a0b8 | add sparse gatherv2 | 5 years ago |
| kingfo | 38436f929f | move hook function to primtivePy class | 5 years ago |
| Xiaoda Zhang | 1cfb52bc0e | add the reshape part of the embeddinglookup backward operator | 5 years ago |
| mindspore-ci-bot | d5f55f0820 | !1769 [AutoParallel]GatherV2_support_host_device | 5 years ago |
| BowenK | 96379faa3a | Remove ZerosLikeTensor and sub with ZerosLike | 5 years ago |
| lichenever | 1437966c98 | gatherv2_support_host_and_device | 5 years ago |
| yangzhenzhang | 19bd830539 | support forward reduce scatter for matmul | 5 years ago |
| Yi Huaijie | 75ca84d260 | INFO user when set_strategy not under [semi_]auto_parallel mode | 5 years ago |
| lichenever | 19a24b86ac | add gatherv2 distributed op | 5 years ago |
| yao_yf | 5a6540450e | use param name as the key of strategy checkpoint | 5 years ago |
| ougongchang | 0ed6d9178e | add Histogram summary operator | 6 years ago |
| yangzhenzhang | 36ffb66782 | add parallel op for square | 6 years ago |
| yangzhenzhang | 57cd9f8188 | add parallel op for sigmoidloss | 6 years ago |
| yangzhenzhang | 6d522f0a4f | add parallel op for layernorm | 6 years ago |
| mindspore-ci-bot | 39a46b9342 | !245 Add bool type check in communication operator | 6 years ago |
| c00425699 | d62f560b50 | add_bool_type_check_in_comm_op | 6 years ago |
| lichenever | b81cc6ea4f | add minimum distributed op | 6 years ago |
| yangzhenzhang | b34c0e7a17 | add parallel op for dropoutdomask | 6 years ago |
| yangzhenzhang | dd0d4e6b84 | add parallel ops for expand dims | 6 years ago |
| lichenever | ff808021c7 | register not equal distributed op | 6 years ago |
| yangzhenzhang | 110640e2ad | add parallel ops for neg and batchmatmul | 6 years ago |
| zhunaipan | 930a1fb0a8 | initial version | 6 years ago |