103 Commits (4ff2ff123ca02beef62e53c5df4ef240f4cf7bbf)

Author SHA1 Message Date
Xiaoda Zhang 07e1e39a82 fix some codestyle warnings 4 years ago
chendongsheng 1a72c6ac35 added scale out and scale in 5 years ago
ZPaC 768f6b40c3 Optimize ps doc 4 years ago
huangxinjing e79db658e8 Fix codex for python file 4 years ago
ZPaC 19a2585ba4 Optimize server log 5 years ago
chendongsheng 2aae0b01ec tcp support ssl 5 years ago
mindspore-ci-bot ee885b4e58 !15180 enable not fully use opt shard 5 years ago
ZPaC a6f9814552 Add Server part 3 5 years ago
Ziyan 2a752f24bf enable not fully use opt shard 5 years ago
Ziyan b3eebea4de fix shard strategy for batchnorm 5 years ago
Ziyan 27c7c8618e fix get seed validation 5 years ago
mindspore-ci-bot 444ff97206 !13505 typo fix 5 years ago
Ziyan 6e475bea3f add warning for parameter share among multi devices 5 years ago
Ziyan d19d42ee44 modify grad accu and comm fusion api 5 years ago
wudenggang b17a558af4 typo fix 5 years ago
liujunzhu 6541b96c40 Add communication parallel mode. 5 years ago
chendongsheng 6c22dc0d55 added worker 5 years ago
lizhenyu f17534af08 ps cache support sparse 5 years ago
yangzhenzhang cbca482e59 delete useless parameter in pipeline parallel 5 years ago
yangzhenzhang 7303c3d3b8 add group ckpt 5 years ago
yangzhenzhang 9da3f9bec9 mini step grad accumulation 5 years ago
lizhenyu 77c80e48da bugfix: ps cache data process performance decay 5 years ago
mindspore-ci-bot 7648b3c119 !10717 [bugfix] server core dump after traning 5 years ago
lizhenyu 7eb49cfce7 [bugfix] server core dump after traning 5 years ago
Ziyan 2c3b99ce91 fix infer rank list typo 5 years ago
Ziyan 660f578988 fix standalone prediction 5 years ago
yanglf1121 918679daa3 add tensor.ndim and rename tensor.size() to tensor.size 5 years ago
mindspore-ci-bot 1ff38be903 !9742 fix distribute predict 5 years ago
Ziyan e7a24611f4 fix distribtued predict 5 years ago
lizhenyu e3f7ae61db add ps cache manager 5 years ago
Ziyan e7e9dae54d support distributed predict 5 years ago
jinyaohui e6f9806cfb add broadcast 5 years ago
Xiaoda Zhang aa13d6b1cd support for-loop in auto-parallel 5 years ago
Xiaoda Zhang aa84484049 enabling approximation in DP algorithms 5 years ago
Ziyan c33f2cd796 fix auto optimizer weight shard 5 years ago
Yi Huaijie b28a6ff88e refactor seed interfaces 5 years ago
Yi Huaijie c5f7700992 refactor get_seed() interface 5 years ago
lichenever cfffff2875 add check for allreduce fusion 5 years ago
Ziyan ddc0113058 enable parallel optimizer in auto parallel 5 years ago
huangxinjing 2fa6a3b3c2 Fix doc error 5 years ago
mindspore-ci-bot 9bd34a1b29 !6673 Add stage information for ops and strategy 5 years ago
huangxinjing 4ef439e27b Add stage information for ops and strategy 5 years ago
mindspore-ci-bot 794f07bdc5 !6824 [AutoParallel]Add check for allreduce fusion 5 years ago
lichenever 395d3f0848 add_limit_for_allreduce_fusion 5 years ago
ZPaC 28c57f3f29 Change prefix for server ckpt callback 5 years ago
mindspore-ci-bot 97e8742f84 !6500 Add default value for the autoparallel search mode 5 years ago
mindspore-ci-bot cdff9412dc !6483 remove parameter broadcast 5 years ago
huangxinjing 8ba1503135 Add default value for auto search parallel mode 5 years ago
Ziyan cc131193ec remove parameter broadcast 5 years ago
mindspore-ci-bot c8e9d46391 !6450 Change PS directory. 5 years ago