86 Commits (b56fc0c2af8a3dc8729237b5e1e6e4e4e5d45dfa)

Author SHA1 Message Date
  lizhenyu f17534af08 ps cache support sparse 5 years ago
  yangzhenzhang cbca482e59 delete useless parameter in pipeline parallel 5 years ago
  yangzhenzhang 7303c3d3b8 add group ckpt 5 years ago
  yangzhenzhang 9da3f9bec9 mini step grad accumulation 5 years ago
  lizhenyu 77c80e48da bugfix: ps cache data process performance decay 5 years ago
  mindspore-ci-bot 7648b3c119 !10717 [bugfix] server core dump after training 5 years ago
  lizhenyu 7eb49cfce7 [bugfix] server core dump after training 5 years ago
  Ziyan 2c3b99ce91 fix infer rank list typo 5 years ago
  Ziyan 660f578988 fix standalone prediction 5 years ago
  yanglf1121 918679daa3 add tensor.ndim and rename tensor.size() to tensor.size 5 years ago
  mindspore-ci-bot 1ff38be903 !9742 fix distribute predict 5 years ago
  Ziyan e7a24611f4 fix distributed predict 5 years ago
  lizhenyu e3f7ae61db add ps cache manager 5 years ago
  Ziyan e7e9dae54d support distributed predict 5 years ago
  jinyaohui e6f9806cfb add broadcast 5 years ago
  Xiaoda Zhang aa13d6b1cd support for-loop in auto-parallel 5 years ago
  Xiaoda Zhang aa84484049 enabling approximation in DP algorithms 5 years ago
  Ziyan c33f2cd796 fix auto optimizer weight shard 5 years ago
  Yi Huaijie b28a6ff88e refactor seed interfaces 5 years ago
  Yi Huaijie c5f7700992 refactor get_seed() interface 5 years ago
  lichenever cfffff2875 add check for allreduce fusion 5 years ago
  Ziyan ddc0113058 enable parallel optimizer in auto parallel 5 years ago
  huangxinjing 2fa6a3b3c2 Fix doc error 5 years ago
  mindspore-ci-bot 9bd34a1b29 !6673 Add stage information for ops and strategy 5 years ago
  huangxinjing 4ef439e27b Add stage information for ops and strategy 5 years ago
  mindspore-ci-bot 794f07bdc5 !6824 [AutoParallel]Add check for allreduce fusion 5 years ago
  lichenever 395d3f0848 add_limit_for_allreduce_fusion 5 years ago
  ZPaC 28c57f3f29 Change prefix for server ckpt callback 5 years ago
  mindspore-ci-bot 97e8742f84 !6500 Add default value for the autoparallel search mode 5 years ago
  mindspore-ci-bot cdff9412dc !6483 remove parameter broadcast 5 years ago
  huangxinjing 8ba1503135 Add default value for auto search parallel mode 5 years ago
  Ziyan cc131193ec remove parameter broadcast 5 years ago
  mindspore-ci-bot c8e9d46391 !6450 Change PS directory. 5 years ago
  ZPaC 0b49f0fb57 change PS dir 5 years ago
  yao_yf b70204c080 auto parallel context add notes and func mv 5 years ago
  mindspore-ci-bot 5a76bd717d !6185 fix api comments 5 years ago
  Ziyan 8ea177e614 fix_api_problems 5 years ago
  Yi Huaijie e4cd67596f raise RuntimeError when using full_batch under neither semi_auto_parallel nor auto_parallel 5 years ago
  mindspore-ci-bot 76e544fab6 !5894 enhance ops API comment part3 5 years ago
  simson e7f3a283fc enhance ops API comment part3 5 years ago
  ZPaC 87bf2a7dcd Add PS context. 5 years ago
  lichenever f2d3fd34ce rectification_allreduce_fusion_api 5 years ago
  yao_yf d4cfe55c04 rename mirror_mean to gradients_mean 5 years ago
  Xiaoda Zhang 42f1241270 remove 'multi-subgraphs' to internal 5 years ago
  yao_yf 8f7aa5bd5a auto parallel context modify 5 years ago
  mindspore-ci-bot 7d4f481884 !5017 remove internal interface in wide&deep 5 years ago
  yao_yf a9a8e323b2 remove internal interface in wide&deep 5 years ago
  Yi Huaijie 394be43492 raise RuntimeError when set different mode after Initializer created 5 years ago
  ZPaC 830172201a Fix multi server precision error. 5 years ago
  Yi Huaijie 89a4ebf8a1 parallel mode must be set before creating an initializer 5 years ago