81 Commits (b56fc0c2af8a3dc8729237b5e1e6e4e4e5d45dfa)

| Author | SHA1 | Message | Date |
|---|---|---|---|
| huangbingjian | b56fc0c2af | [auto-monad] Do not insert VirtualDiv after UpdateState | 5 years ago |
| He Wei | 7d9a783993 | [auto-monad] Support side-effects by auto-monad | 5 years ago |
| mindspore-ci-bot | b189f177bb | Change tuple_getitem to TupleGetItem and some other ops, merge from r1.1 to master | 5 years ago |
| mindspore-ci-bot | 9fa0499fa0 | Change GatherV2 to Gather r1.1 to master | 5 years ago |
| yangzhenzhang | 7303c3d3b8 | add group ckpt | 5 years ago |
| yangzhenzhang | 9da3f9bec9 | mini step grad accumulation | 5 years ago |
| Ziyan | 98566ddc07 | enable gradients mean in opt shard | 5 years ago |
| lichenever | 39306d64fb | opt_pipeline_split | 5 years ago |
| mindspore-ci-bot | b67aaf6773 | !9832 expose_allgather_fusion_to_users | 5 years ago |
| Ziyan | bbf8ec82b9 | expose allgather fusion interface to users | 5 years ago |
| lichenever | 9595502278 | optimizer_pipline_split | 5 years ago |
| yao_yf | 19fe28cb9b | hange strategys of last nodes in eval/predict at auto parallel mode | 5 years ago |
| Xiaoda Zhang | e78228603b | move parallel-related black-list to core/ir, and fix the cloneCNode bug | 5 years ago |
| lizhenyu | e3f7ae61db | add ps cache manager | 5 years ago |
| Xiaoda Zhang | 14d4926cf0 | simplifying step-auto-parallel | 5 years ago |
| mindspore-ci-bot | 4bb7303463 | !9424 [PipelineSplit]Fix Shared Parameter bug | 5 years ago |
| mindspore-ci-bot | 6b9e402790 | !9396 enable allgather fusion | 5 years ago |
| lichenever | 818e920f02 | fix_pipeline_split_param_shared_bug | 5 years ago |
| Ziyan | e29f5c96cb | enable_allgather_fusion | 5 years ago |
| lichenever | 78e131cf15 | pipeline_split adapt parallel | 5 years ago |
| huangxinjing | f2d5f14e37 | Fix review bot | 5 years ago |
| yangzhenzhang | 278e82a849 | update pipeline parallel | 5 years ago |
| yangzhenzhang | d4d6c4beae | update get device list in parallel ops | 5 years ago |
| mindspore-ci-bot | 2bf165c0b4 | !8557 run cast before allgather in parallel optimzier | 5 years ago |
| yangzhenzhang | 0c2c76d037 | update get rank in parallel ops | 5 years ago |
| Ziyan | 0ddb754edb | run cast before parallel optimizer | 5 years ago |
| yangzhenzhang | 0a79ab82ae | add parallel ops | 5 years ago |
| Yi Huaijie | d7faa77b5e | support int64 shape | 5 years ago |
| lichenever | 7c7006f347 | fix bug if input not used | 5 years ago |
| Ziyan | c33f2cd796 | fix auto optimizer weight shard | 5 years ago |
| huangxinjing | bf5d21770a | Add UnsortedSegmentSum and UnosrtedSemenMin For Oparallel Implements | 5 years ago |
| mindspore-ci-bot | 78f795971b | !7818 add recursion limit for FindParallelCareNode | 5 years ago |
| mindspore-ci-bot | 08e6ac0b09 | !7805 support split ValueList | 5 years ago |
| Yi Huaijie | 3102c4ff8d | support split ValueList | 5 years ago |
| yangzhenzhang | 92d02b7aff | add recursion limit | 5 years ago |
| yao_yf | 65d8e63580 | set last node data parallel or repeat calculate in eval/predict | 5 years ago |
| Ziyan | 069318899a | refactor get cnode strategy | 5 years ago |
| Xiaoda Zhang | fba2bfeb54 | overwrite strategies for star graph structure | 5 years ago |
| Ziyan | adc92496e8 | disable comm fusion in parallel optimizer temp | 5 years ago |
| lichenever | cfffff2875 | add check for allreduce fusion | 5 years ago |
| Ziyan | ddc0113058 | enable parallel optimizer in auto parallel | 5 years ago |
| lichenever | 6dd2c75948 | fix_auto_parallel_find_loss_bug | 5 years ago |
| lichenever | 23a38aa1dc | semi_auto_support_gpt2 | 5 years ago |
| mindspore-ci-bot | 9bd34a1b29 | !6673 Add stage information for ops and strategy | 5 years ago |
| lichenever | e2c8a0bbc5 | support_gpt2_compile_graph | 5 years ago |
| huangxinjing | 4ef439e27b | Add stage information for ops and strategy | 5 years ago |
| Yi Huaijie | 6066b16838 | implement parallel Pack | 5 years ago |
| lichenever | d4bba3f1d2 | fix_auto_parallel_find_loss_bug | 5 years ago |
| mindspore-ci-bot | 754f2b774c | !6409 add batch parallel info black list | 5 years ago |
| yao_yf | b70204c080 | auto parallel context add notes and func mv | 5 years ago |