98 Commits (6fcd6cab684bd2d2ae2da5613efa3aa4e2cd7d5a)

Author SHA1 Message Date
  chendongsheng 2aae0b01ec tcp support ssl 4 years ago
  mindspore-ci-bot ee885b4e58 !15180 enable not fully use opt shard 4 years ago
  ZPaC a6f9814552 Add Server part 3 4 years ago
  Ziyan 2a752f24bf enable not fully use opt shard 5 years ago
  Ziyan b3eebea4de fix shard strategy for batchnorm 4 years ago
  Ziyan 27c7c8618e fix get seed validation 5 years ago
  mindspore-ci-bot 444ff97206 !13505 typo fix 5 years ago
  Ziyan 6e475bea3f add warning for parameter share among multi devices 5 years ago
  Ziyan d19d42ee44 modify grad accu and comm fusion api 5 years ago
  wudenggang b17a558af4 typo fix 5 years ago
  liujunzhu 6541b96c40 Add communication parallel mode. 5 years ago
  chendongsheng 6c22dc0d55 added worker 5 years ago
  lizhenyu f17534af08 ps cache support sparse 5 years ago
  yangzhenzhang cbca482e59 delete useless parameter in pipeline parallel 5 years ago
  yangzhenzhang 7303c3d3b8 add group ckpt 5 years ago
  yangzhenzhang 9da3f9bec9 mini step grad accumulation 5 years ago
  lizhenyu 77c80e48da bugfix: ps cache data process performance decay 5 years ago
  mindspore-ci-bot 7648b3c119 !10717 [bugfix] server core dump after training 5 years ago
  lizhenyu 7eb49cfce7 [bugfix] server core dump after training 5 years ago
  Ziyan 2c3b99ce91 fix infer rank list typo 5 years ago
  Ziyan 660f578988 fix standalone prediction 5 years ago
  yanglf1121 918679daa3 add tensor.ndim and rename tensor.size() to tensor.size 5 years ago
  mindspore-ci-bot 1ff38be903 !9742 fix distribute predict 5 years ago
  Ziyan e7a24611f4 fix distributed predict 5 years ago
  lizhenyu e3f7ae61db add ps cache manager 5 years ago
  Ziyan e7e9dae54d support distributed predict 5 years ago
  jinyaohui e6f9806cfb add broadcast 5 years ago
  Xiaoda Zhang aa13d6b1cd support for-loop in auto-parallel 5 years ago
  Xiaoda Zhang aa84484049 enabling approximation in DP algorithms 5 years ago
  Ziyan c33f2cd796 fix auto optimizer weight shard 5 years ago
  Yi Huaijie b28a6ff88e refactor seed interfaces 5 years ago
  Yi Huaijie c5f7700992 refactor get_seed() interface 5 years ago
  lichenever cfffff2875 add check for allreduce fusion 5 years ago
  Ziyan ddc0113058 enable parallel optimizer in auto parallel 5 years ago
  huangxinjing 2fa6a3b3c2 Fix doc error 5 years ago
  mindspore-ci-bot 9bd34a1b29 !6673 Add stage information for ops and strategy 5 years ago
  huangxinjing 4ef439e27b Add stage information for ops and strategy 5 years ago
  mindspore-ci-bot 794f07bdc5 !6824 [AutoParallel]Add check for allreduce fusion 5 years ago
  lichenever 395d3f0848 add_limit_for_allreduce_fusion 5 years ago
  ZPaC 28c57f3f29 Change prefix for server ckpt callback 5 years ago
  mindspore-ci-bot 97e8742f84 !6500 Add default value for the autoparallel search mode 5 years ago
  mindspore-ci-bot cdff9412dc !6483 remove parameter broadcast 5 years ago
  huangxinjing 8ba1503135 Add default value for auto search parallel mode 5 years ago
  Ziyan cc131193ec remove parameter broadcast 5 years ago
  mindspore-ci-bot c8e9d46391 !6450 Change PS directory. 5 years ago
  ZPaC 0b49f0fb57 change PS dir 5 years ago
  yao_yf b70204c080 auto parallel context add notes and func mv 5 years ago
  mindspore-ci-bot 5a76bd717d !6185 fix api comments 5 years ago
  Ziyan 8ea177e614 fix_api_problems 5 years ago
  Yi Huaijie e4cd67596f raise RuntimeError when using full_batch neither under semi_auto_parallel nor auto_parallel 5 years ago