67 Commits (bada826b18f3e4dc8aac0f82af3457794de9c7bb)

| Author | SHA1 | Message | Date |
|---|---|---|---|
| Yi Huaijie | c5f7700992 | refactor get_seed() interface | 5 years ago |
| lichenever | cfffff2875 | add check for allreduce fusion | 5 years ago |
| Ziyan | ddc0113058 | enable parallel optimizer in auto parallel | 5 years ago |
| huangxinjing | 2fa6a3b3c2 | Fix doc error | 5 years ago |
| mindspore-ci-bot | 9bd34a1b29 | !6673 Add stage information for ops and strategy | 5 years ago |
| huangxinjing | 4ef439e27b | Add stage information for ops and strategy | 5 years ago |
| mindspore-ci-bot | 794f07bdc5 | !6824 [AutoParallel]Add check for allreduce fusion | 5 years ago |
| lichenever | 395d3f0848 | add_limit_for_allreduce_fusion | 5 years ago |
| ZPaC | 28c57f3f29 | Change prefix for server ckpt callback | 5 years ago |
| mindspore-ci-bot | 97e8742f84 | !6500 Add default value for the autoparallel search mode | 5 years ago |
| mindspore-ci-bot | cdff9412dc | !6483 remove parameter broadcast | 5 years ago |
| huangxinjing | 8ba1503135 | Add default value for auto search parallel mode | 5 years ago |
| Ziyan | cc131193ec | remove parameter broadcast | 5 years ago |
| mindspore-ci-bot | c8e9d46391 | !6450 Change PS directory. | 5 years ago |
| ZPaC | 0b49f0fb57 | change PS dir | 5 years ago |
| yao_yf | b70204c080 | auto parallel context add notes and func mv | 5 years ago |
| mindspore-ci-bot | 5a76bd717d | !6185 fix api comments | 5 years ago |
| Ziyan | 8ea177e614 | fix_api_problems | 5 years ago |
| Yi Huaijie | e4cd67596f | raise RuntimeError when using full_batch neither under semi_auto_parallel nor auto_parallel | 5 years ago |
| mindspore-ci-bot | 76e544fab6 | !5894 enhance ops API comment part3 | 5 years ago |
| simson | e7f3a283fc | enhance ops API comment part3 | 5 years ago |
| ZPaC | 87bf2a7dcd | Add PS context. | 5 years ago |
| lichenever | f2d3fd34ce | rectification_allreduce_fusion_api | 5 years ago |
| yao_yf | d4cfe55c04 | rename mirror_mean to gradients_mean | 5 years ago |
| Xiaoda Zhang | 42f1241270 | remove 'multi-subgraphs' to internal | 5 years ago |
| yao_yf | 8f7aa5bd5a | auto parallel context modify | 5 years ago |
| mindspore-ci-bot | 7d4f481884 | !5017 remove internal interface in wide&deep | 5 years ago |
| yao_yf | a9a8e323b2 | remove internal interface in wide&deep | 5 years ago |
| Yi Huaijie | 394be43492 | raise RuntimeError when set different mode after Initializer created | 5 years ago |
| ZPaC | 830172201a | Fix multi server precision error. | 5 years ago |
| Yi Huaijie | 89a4ebf8a1 | parallel mode must be set before create an initializer | 5 years ago |
| yao_yf | cbb4363fa7 | remove to_full_tensor and load_inputs in exexute stage | 5 years ago |
| yangzhenzhang | 4a0e6ff7fc | update field split | 5 years ago |
| yuchaojie | 64a1560f1a | add allreduce group for resnet gpu version | 5 years ago |
| Ziyan | 98e2ee90de | fix optimizer parallel problems | 5 years ago |
| yuchaojie | ed9cf2036c | add nccl default allreduce fusion group | 5 years ago |
| mindspore-ci-bot | edec821c50 | !2876 set reshape operator no redistribution for auto parallel | 5 years ago |
| lirongzhen1 | c1eba79b83 | set reshape redistribution strategy attribute to no redistribution | 5 years ago |
| Ziyan | 39f08eb7dd | enable optimizer parallel | 5 years ago |
| He Wei | f337c6bc14 | Decouple ParamValue from python | 5 years ago |
| Ziyan | 0925e35252 | enable optimizer parallel with broadcast | 5 years ago |
| chenjianping | 343889cdb7 | building _ms_mpi with mpi_interface | 5 years ago |
| mindspore-ci-bot | a64e0de389 | !2366 [CT][MS][Auto-Parallel] Recursive interface needs to be changed to external interface | 5 years ago |
| hongxing | d798325127 | supplement description about default value | 5 years ago |
| mindspore-ci-bot | 9cddaee9b3 | !2323 [CT][MS][Auto-Parallel]Recursive interface needs to be changed to external interface | 5 years ago |
| hongxing | 3ad3a71fc7 | change interface | 5 years ago |
| Yi Huaijie | eae69a386a | delete attribute seed of Initializer | 5 years ago |
| chenjianping | 6034f9c1e2 | support host reduce scatter and mpi config | 5 years ago |
| mindspore-ci-bot | 4df861cb62 | !1672 support load full dataset on each device | 5 years ago |
| kswang | 699166e552 | default fusion group for ge | 5 years ago |